SOUTH AFRICA

AI literacy a critical component in 21st-century learning
In the ever-evolving landscape of higher education, a new imperative is gaining prominence: artificial intelligence (AI) literacy. As AI increasingly permeates academic and professional spheres, the explosion of scholarly discourse on AI literacy – evident from the surge in recent academic publications – underscores its critical role in equipping students to navigate an AI-driven world.
As we celebrate International Literacy Day on 8 September, we should ponder the following important questions: What precisely constitutes AI literacy? Why is it becoming increasingly crucial in the context of higher education? And, perhaps most pressingly, how can it be practically cultivated?
With a variety of AI tools and applications becoming increasingly prevalent and ubiquitous in academic and professional life, these questions have moved from the periphery to the centre of educational discourse. In this article, the concept of AI literacy, the significance of non-professional AI literacy in higher education, and practical approaches to its development are unpacked.
What is AI literacy?
To understand AI literacy, we must first revisit the concept of literacy itself. Traditionally associated with reading and writing, literacy has evolved to encompass a wide range of skills necessary for engaging with and understanding diverse forms of communication, particularly in the digital age.
AI literacy can be seen as an extension of this concept. Although there is not a single agreed-upon definition of AI literacy, many use the one offered by American scholars Duri Long and Brian Magerko. They define it as “a set of competencies that enables individuals to critically evaluate AI technologies; communicate and collaborate effectively with AI; and use AI as a tool online, at home, and in the workplace”.
However, a clear distinction must be made between the AI literacy needed by professionals – those who develop and refine AI systems – and that required by non-professional users.
For professionals, AI literacy includes technical knowledge, such as programming, machine learning and data science. These professionals need an understanding of AI’s inner workings in order to innovate and improve AI technology.
On the other hand, for non-professionals, AI literacy is about knowing how to interact with AI tools, with a focus on generative AI (GenAI) tools capable of creating text, video, audio, images, music, and so forth.
This implies understanding the implications of using these tools as well as being aware of the ethical, social and economic impacts they can have. This kind of literacy is increasingly vital as AI tools become more integrated into everyday life.
Numerous AI literacy frameworks exist, encompassing dimensions such as conceptual understanding, application, evaluation, collaboration, ethics, and autonomy.
These frameworks aim to equip individuals with the skills to navigate the complex landscape of AI, ensuring that users can critically evaluate and responsibly engage with AI technologies.
At a minimum, non-professional AI literacy should include a basic understanding of AI, effective use of AI tools and systems, and critical evaluation of AI outputs and AI use.
• A basic understanding of AI, with a focus on GenAI: This involves learning what AI and GenAI are, including their capabilities, limitations, and potential applications as well as where you might knowingly or unknowingly encounter them.
Students need to understand the basics of how these systems work to know how to interact with them, and how to use them effectively, safely, and responsibly. Tools like ChatGPT, ChatPDF and DALL-E are common examples of GenAI that students might interact with.
• Effective use of AI tools and systems: Students should be able to use AI tools to enhance their work, rather than relying on these tools to do the work for them. This means knowing when and how to use AI in a way that adds value without compromising the integrity of their efforts. It includes the ability to craft effective prompts and to apply appropriate data-protection measures.
• Critical evaluation of AI outputs and use: This is perhaps the most important component of AI literacy. Students must be equipped to critically assess the choice of tool to use, the nature of the interaction or collaboration, and the outputs of AI systems.
Why AI literacy matters
Increasing concerns about academic integrity are linked to AI literacy in higher education. As AI tools like ChatGPT become more prevalent, students who lack a proper understanding of these tools may inadvertently engage in academic misconduct, such as submitting AI-generated content without proper attribution or critical evaluation, ultimately undermining their own learning.
For instance, an AI-literate student might use a tool like ChatGPT to brainstorm ideas or get clarification on complex topics, but would understand the importance of critically evaluating the chatbot’s output and incorporating it into their work in a way that reflects their own understanding and effort.
This underscores the need for institutions not only to set clear guidelines for AI use, but also to actively cultivate AI literacy as a core component of academic integrity education.
In our work at Stellenbosch University in South Africa, we suggest a values-driven, contextual approach, based on the following guiding principles:
• Authenticity: Whose work is it? Ensure that the final product is genuinely reflective of the student’s own understanding and effort.
• Transparency: Have you declared your use of AI? Students should be open about how AI was involved in their work.
• Accountability: Where does the responsibility lie? Ultimately, students are responsible for the content they submit, whether or not it was AI-assisted.
• Fairness: Is your use of AI fair to all involved? Consider the ethical implications of AI use, especially in collaborative or competitive settings.
We cannot assume that students will know what constitutes allowable use in each context.
Integrity scholar Sarah Elaine Eaton suggests hybrid human-AI writing might soon become the norm, raising questions about the concept of originality. Higher education institutions, thus, cannot escape the responsibility of addressing at least non-professional AI literacy in different disciplinary contexts while preparing students for a world beyond the confines of academia.
Developing AI literacy
Given the importance of AI literacy, the question becomes: How can it be developed, particularly at universities?
One effective strategy is using AI detection tools, like Turnitin’s AI text reports, as teaching aids rather than punitive measures. By engaging with these reports formatively, students learn to recognise AI’s limitations and the importance of authenticity in their work.
Additionally, innovative exercises, such as critiquing AI-generated writing or using GenAI to test student-generated content questions, encourage critical thinking and evaluative judgment.
For instance, in crafting this article, I utilised AI assistants to help structure ideas and refine language. By engaging critically with these tools, I ensured that the final product authentically reflects my insights, exemplifying the balance between AI assistance and human creativity that AI literacy seeks to promote.
As AI continues to reshape our world, tertiary institutions must take proactive steps to integrate AI literacy into their curricula across all disciplines. This isn’t just about teaching students to use new tools; it’s about equipping them with the critical thinking skills needed to navigate an AI-driven future ethically and effectively.
Going forward, educators, administrators and policymakers must collaborate to develop comprehensive AI literacy programmes, update existing courses to include AI components, and create opportunities for hands-on learning with AI tools. By prioritising AI literacy, we can ensure that our students are not just prepared for the future, but are positioned to shape it responsibly and innovatively.
Dr Hanelie Adendorff is a senior adviser at the Centre for Teaching and Learning, or CTL, at Stellenbosch University, South Africa. Her colleague, Dalene Joubert, and Emma Bowers-Swart, an ad hoc assistant at the CTL, contributed to this article.