Who Owns ‘Truth’ in the Age of Educational GenAI?

As generative AI becomes more deeply embedded in digital education, it no longer simply delivers knowledge; it shapes it. What counts as truth, and whose truth is represented, becomes increasingly complex. Rather than offering fixed answers, this piece challenges educational technologists to confront the ethical tensions and contextual sensitivities that now define digital learning.

Who Owns ‘Truth’ in the Age of Educational GenAI?

Author: Prof. John Traxler, UNESCO Chair, Commonwealth of Learning Chair and Academic Director of the Avallain Lab

St. Gallen, May 23, 2025 – Idealistically, perhaps, teaching and learning are about sharing truths: sharing facts, values, ideas and opinions. Over the past three decades, digital technology has been increasingly involved or implicated in teaching and learning, and increasingly involved or implicated in shaping the truths, the facts, values, ideas and opinions that are shared. Truth seems increasingly less absolute, stable and reliable, and digital technology seems increasingly less neutral and passive.

The emergence of powerful and easily available AI, both inside education and in the societies outside it, only amplifies and accelerates the instabilities and uncertainties around truth, making it far less convincing for educational digital technologists to stand aside and hope that research, legislation or public opinion will understand the difficulties and make the rules. This piece unpacks these sometimes controversial and uncomfortable propositions, providing no easy answers but perhaps clarifying the questions.

Truth and The Digital

Truth is always tricky, and it is getting trickier, ever faster. We all trade in truth; it is the foundation of our communities and our companies, our relationships and our transactions. It is the basis on which we teach and learn, understand and act. And we need to trust it.

The last two decades have, however, seen the phrases ‘fake news’ and ‘post-truth’ used to make assertions and counter-assertions in public spheres, physical and digital, insidiously reinforcing the notion that truth is subjective, that everyone has their own truth and that it just needs to be shouted loudest. These two decades also saw the emergence and growing visibility of communities, big and small, on social media, able to coalesce around their own specific beliefs, their own truths, some benign, many malign, but all claimed by their adherents to be truths.

The digital was conceived, ideally, as separate and neutral. It was just the plumbing, the pipes and the reservoirs that stored and transferred truths, from custodian or creator to consumer, from teacher to learner. Social media, intrusive, pervasive and universal, changed that, hosting all those different communities.

The following selection of assertions includes some that are widely accepted as true, though this always depends on the community; others are generally recognised as false; and some, the most problematic, generate profound disagreement and discomfort.

  • The moon is made of blue cheese
  • The Earth is flat
  • God exists
  • Smoking is harmless
  • The Holocaust never happened
  • Prostate cancer testing is unreliable
  • Gay marriage is normal 
  • Climate change isn’t real 
  • Evolution is fake
  • Santa Claus exists 
  • Assisted dying is a valid option
  • Women and men are equal
  • The sun will rise
  • Dangerous adventure sports are character-building
  • Colonialism was a force for progress

These assertions can all be found somewhere on the internet, and all form part of the data on which GenAI is trained as it harvests the world’s digital resources. Whether each is regarded as true depends on the community or culture.

Saying, ‘It all depends on what you mean by …’ sidesteps the fundamental issue. Yes, some of these assertions may be merely circular while others may allow some prevarication and hair-splitting, but they all exist.

Educational GenAI

In terms of the ethics of educational AI, extreme assertions like ‘the sun will rise’ or ‘the moon is made of blue cheese’ are not a challenge. If a teacher wants to use educational GenAI tools to produce teaching materials that make such assertions, the response is unequivocal: either ‘here are your teaching materials’ or ‘sorry, we can’t support you making that assertion to your pupils’.

Where educational AI needs much more development is in dealing with assertions which, for us, may describe non-controversial truths, such as ‘women and men are equal’ and ‘gay marriage is normal’, but which may be met by different cultures and communities with violently different opinions.

GenAI harvests the world’s digital resources and regurgitates them as plausible-sounding output, and in doing so captures all the prejudice, biases, half-truths and fake news already out there in those resources. The role of educational GenAI tools is to mediate and moderate these resources in the interests of truth and safety, but we argue that this is not straightforward. The more we know about learners’ cultures, contexts and countries, the more likely we are to provide resources with which they are comfortable, even if we are not.

Who Do We Believe?

Unfortunately, some of the existing authorities that might have helped, guided and adjudicated on these questions are less useful than they once were. The speed and power of GenAI have overwhelmed and overtaken them.

Regulation and guidance have often mixed pre-existing concerns about data security with assorted general principles and haphazard examples of their application, all focused on education within the education system rather than learning outside it. The education system has, in any case, been distracted by concerns about plagiarism and has not yet addressed the long-term issue of ensuring that school-leavers and graduates flourish and prosper in societies and economies where AI is already ubiquitous, pervasive, intrusive and often unnoticed. Moreover, the treatment of minority communities or cultures within education systems may itself already be problematic.

Education systems exist within political systems. We have to acknowledge that digital technologies, including educational digital technologies, have become more overtly politicised as global digital corporations and powerful presidents have become more closely aligned.

Meanwhile, the conventional cycle of research funding, delivery, reflection and publication is sluggish compared to developments in GenAI. Opinions and anecdotes in blogs and media have instead fed the appetite for findings, evaluations, judgments and positions. Likewise, the conventional cycle of guidance, training and regulation is slow, and many of its outputs have been muddled and generalised. Abstract theoretical critiques have not always had a chance to engage with practical experiences and technical developments, often leading to evangelical enthusiasm or apocalyptic predictions.

So, educational technologists working with GenAI may have little adequate guidance or regulation for the foreseeable future.

Why is This Important?

Educational technologists are no longer bystanders, merely supplying and servicing the pipes and reservoirs of education. They have become essential intermediaries, bridging the gap between the raw capabilities of GenAI, which are often indiscriminate, and the diverse needs, cultures and communities of learners. Ensuring learners’ safe access to truth is, however, not straightforward, since both truth and safety are relative and changeable, and so educational technologists must strive to deliver truths to learners with progressively more sensitivity and safety.

At the Avallain Lab, aligned with Avallain Intelligence, our broader AI strategy, we have begun a thorough and ongoing programme of building ethics controls that identify what are almost universally agreed to be harmful and unacceptable assertions. We aim to ensure that our use of educational GenAI in Avallain systems reflects our core values, while recognising that although principles for trustworthy AI may be universal, the ways they manifest vary from context to context, posing a challenge for GenAI tools. This challenge can be mitigated through human intervention, reinforcing the importance of teachers and educators. Furthermore, GenAI tools must become more responsive to local contexts, a responsibility that lies with the deployers and suppliers of AI systems. While no solution can fully resolve society’s evolving controversies, we are committed to staying ahead in anticipating and responding to them.
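To make the shape of such controls concrete, the sketch below shows, in Python, how a two-tier ethics check might work: assertions that are almost universally agreed to be harmful are refused outright, while context-sensitive assertions are escalated to a human educator rather than decided silently by the tool. This is a minimal illustration only; the function, the class names and the policy lists are hypothetical and do not describe Avallain’s actual implementation.

```python
# A minimal, hypothetical sketch of a two-tier ethics control for GenAI output.
# Nothing here describes Avallain's actual implementation.
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    ALLOW = "allow"                # deliver the generated materials
    BLOCK = "block"                # near-universally harmful: refuse outright
    REVIEW = "needs_human_review"  # culturally contested: escalate to the teacher


@dataclass
class LearnerContext:
    locale: str     # e.g. "en-GB"
    age_group: str  # e.g. "secondary"


# Tier 1: assertions treated as harmful in almost every context.
UNIVERSALLY_BLOCKED = {
    "the holocaust never happened",
    "smoking is harmless",
}

# Tier 2: assertions whose acceptability varies by community or culture,
# so a human educator makes the final call.
CONTEXT_SENSITIVE = {
    "gay marriage is normal",
    "assisted dying is a valid option",
}


def ethics_check(generated_text: str, context: LearnerContext) -> Verdict:
    """Classify generated teaching material before it reaches learners."""
    text = generated_text.lower()
    if any(claim in text for claim in UNIVERSALLY_BLOCKED):
        return Verdict.BLOCK
    if any(claim in text for claim in CONTEXT_SENSITIVE):
        # In a fuller version, context.locale might tune which assertions are
        # escalated; here every culturally sensitive match goes to the teacher.
        return Verdict.REVIEW
    return Verdict.ALLOW


if __name__ == "__main__":
    ctx = LearnerContext(locale="en-GB", age_group="secondary")
    print(ethics_check("A worksheet arguing that smoking is harmless.", ctx))
    # -> Verdict.BLOCK
```

The design point the sketch illustrates is the one argued above: automated controls can handle the near-universal cases, but culturally contested material should be routed to teachers and educators, informed by what is known of the learners’ local context.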

About Avallain

At Avallain, we are on a mission to reshape the future of education through technology. We create customisable digital education solutions that empower educators and engage learners around the world. With a focus on accessibility and user-centred design, powered by AI and cutting-edge technology, we strive to make education engaging, effective and inclusive.

Find out more at avallain.com

About TeacherMatic

TeacherMatic, a part of the Avallain Group since 2024, is a ready-to-go AI toolkit for teachers that saves hours of lesson preparation by using scores of AI generators to create flexible lesson plans, worksheets, quizzes and more.

Find out more at teachermatic.com

Contact:

Daniel Seuling

VP Client Relations & Marketing

dseuling@avallain.com

How Can We Navigate the Path to Truly Inclusive Digital Education?

True inclusivity in digital education demands more than good intentions. Colonial legacies still influence the technologies and systems we use today. As we embrace AI, we must consider whether it truly serves all learners or if it carries the biases of the past along with the impact of digital neo-colonialism in education. Drawing on work commissioned by UNESCO and discussions across UK universities, this is an opportunity to recognise hidden influences and ultimately create a fairer and more equitable digital learning environment.

How Can We Navigate the Path to Truly Inclusive Digital Education?

Author: John Traxler, UNESCO Chair, Commonwealth of Learning Chair and Academic Director of the Avallain Lab

What Are We Talking About?

St. Gallen, April 25, 2025 – This blog draws on work commissioned by UNESCO, to be published later in the year¹, and on webinars across UK universities. Discussions about decolonising educational technology have formed part of initiatives in universities globally, alongside those about decolonising the curriculum, as part of the ‘inclusion, diversity and equity’ agenda, and, in the wider world, alongside movements for reparations² and repatriation³.

This blog was written from an English perspective. Other authors would write it differently.

Decolonising is a misleadingly negative-sounding term. The point of ‘decolonising’ is often misunderstood to be merely remediation, undoing the historical wrongs to specific communities and cultures and then making amends. Yes, it is those things, but it is also about enriching the educational experience of all learners, helping them understand and appreciate the richness and diversity of the world around them.

Colonialism is not limited to the historical activities of British imperialists or even European ones; Tsarist Russia, Soviet Russia, Imperial China, Communist China and Ottoman Turkey are all examples. It remains evident in the one-time coloniser nations and the one-time colonised: Punjabi communities in the English West Midlands and in Punjab itself both still live with the active legacies of an imperial past. It is present in legacy ex-colonial education systems, in the ‘soft power’ of the Alliance Française, the Voice of America, the Goethe Institute, the British Council, the Instituto Cervantes, the World Service, the Peace Corps and the Confucius Institutes, and is now resurgent as the digital neo-colonialism of global corporations headquartered in Silicon Valley.

Why does it matter? It matters because it is an issue of justice and fairness, of right and wrong, and it matters to policy-makers, teachers, learners, employers, companies and the general public as a visible and emotive issue.

What About Educational Technology?

How is it relevant to educational technology? Firstly, ‘educational technology’ is only the tip of the iceberg in terms of how people learn with digital technology. People learn casually, opportunistically and unsupported, driven by momentary curiosity, self-improvement and economic necessity. They do so outside systems of formal instruction. Decolonising ‘educational technology’ may be easier and more specific than decolonising the digital technologies of informal learning, but they have many technologies in common.

At the most superficial level, the interactions and interfaces of digital technologies are dominated by images that betray their origins: visual metaphors such as egg-timers, desktops, files, folders, analogue clocks and wastepaper bins, gestures like the ‘thumbs up’, and cultural assumptions such as green meaning ‘go’. These technologies often default to systems and conventions shaped by history, such as the Gregorian calendar, the International Date Line, Mercator projections, Imperial weights and measures (or the Système International) and naming conventions like the Far East, the West Indies and Latin America. They also tend to prioritise colonial legacies: European character sets, American spelling and left-to-right, top-to-bottom typing.

Speech recognition still favours the global power languages and their received pronunciation, vocabulary and grammar; other languages and dialects only come on stream slowly, and likewise language translation. Furthermore, the world’s digital content is strongly biased in favour of these powerful languages, values and interests. Consider Wikipedia, for example, where content in English outweighs that in Arabic by about ten to one, and content on Middle-earth outweighs that on most of Africa. Search engines are common tools for every kind of learner, but again, the research literature highlights their bias in favour of specific languages, cultures and ideas. Neologisms from (American) English, especially for new products and technologies, are often absorbed into other languages unchanged.

On mobiles, textspeak originated with corporations targeting global markets and was technically built on ASCII (the American Standard Code for Information Interchange), so different language communities were forced to adapt: typing pinyin letters rather than Chinese characters, for example, or inventing Arabish to represent the shapes of Arabic words in Latin characters.

Turning to educational technology itself, we have to ask to what extent these tools embody and reinforce specifically European ideas about teaching, learning, studying, progress, assessment, cheating, courses and even learning analytics and library usage. Additionally, if you look at the educational theories that underpin educational technologies, and then at the theorists who produced them, you see only white male European faces.

The Intersection of Technology and Subjects

There is, however, the extra complication of the intersection between what we use for teaching, the technology, and what gets taught, the subjects. The subjects themselves are also under scrutiny: checking reading lists for balance and representation, refocusing history and geography, recognising marginalised scientists and engineers, and critically positioning language learning. Language education, in particular, must navigate between the global dominance and utility of American English and the need to preserve and support mother tongues, dialects and patois, which are vital to the preservation of intangible cultural heritage.

The Ethical Challenges of AI

The sudden emergence of AI in educational technology embodies both our best hopes and our worst fears. It is accepted that GenAI recycles the world’s digital resources, and with them the world’s misunderstandings, misinformation, prejudices and biases: in this case, its colonialist mindsets, its colonising attitudes and its prejudices about cultures, languages, ethnicities, communities and peoples, about which are superior and which inferior.

To prevent or pre-empt the ‘harms’ associated with AI-driven content, Avallain’s new Ethics Filter Feature minimises the risk of generating biased, harmful, or unethical content. Aligned with Avallain Intelligence, our broader AI strategy, this control offers an additional safeguard that reduces problematic responses, ensuring more reliable and responsible outcomes. The Ethics Filter debuted in TeacherMatic and will soon be made available for activation across Avallain’s full suite of GenAI solutions.

How Should the EdTech Industry Respond?

Practically speaking, we must recognise that the manifestations of colonialism are neither monolithic nor undifferentiated; some of these we can change, while others we cannot.

For all of them, we can raise awareness and criticality, helping developers, technologists, educators, teachers and learners to make judicious choices and safe decisions, to recognise their own possible unconscious biases and unthinking acceptance, and to share their concerns.

We can recognise the diversity of the people we work with, inside and outside our organisations, and seek and respect their cultures and values in what we develop and deploy. We can audit what we use and find or produce alternatives. We can build safeguards and standards.

We can select, reject, transform or mitigate many different manifestations of colonialism as we encounter them and explain to clients and users that this is a positive process, enriching everyone’s experiences of digital learning.


¹ Traxler, J. & Jandrić, P. (2025) ‘Decolonising Educational Technology’, in Peters, M. A., Green, B. J., Kamenarac, O., Jandrić, P. & Besley, T. (Eds.) The Geopolitics of Postdigital Educational Development. Cham: Springer.

² Reparations refers to calls from countries, for example in the Caribbean, for their colonisers (countries, companies, monarchies, churches, cities, families) to redress the economic and financial damage caused by chattel slavery.

³ Repatriation refers to returning cultural artefacts to their countries of origin, for example the Benin Bronzes, the Rosetta Stone and the ‘Elgin’ Marbles currently in the British Museum.

