Generative AI: Cheating Tool or a Chance to Re-imagine Education

Author: Azra Naseem

Artificial Intelligence (AI) has become an integral part of our daily lives in the form of personal assistants like Siri, Alexa, and Google Assistant; recommender systems on platforms like Netflix and YouTube; facial recognition technology such as Face ID on iPhones and Facebook's photo tagging; and other applications like Google's Smart Compose, which completes sentences and phrases as we type. AI has become ingrained in our routines and there is no turning back. AI in education has been a focus of research for the past two decades, exploring areas such as intelligent tutoring systems, personalized learning, data mining, learning analytics, and assessments, among others. But it was the public launch of ChatGPT in November 2022 that served as a wake-up call to the education community, demonstrating that AI is not only here to stay but also has the potential to disrupt traditional ways of teaching, learning, and working.

According to ChatGPT, “It is an AI-based conversational agent or chatbot that uses natural language processing (NLP) to generate human-like responses to text-based queries or prompts.” ChatGPT is an example of Generative AI built on large language models (LLMs) that use vast text-based datasets to make predictions based on a given prompt. DALL-E and Google Bard are other examples of Generative AI.

Current forms of Generative AI have inherent limitations. They are incapable of generating new ideas, engaging in thinking, learning, or experiencing emotions. Google’s Sundar Pichai recently mentioned the potential for emergent behaviour in AI, sparking discussions on whether these tools could attain sentience. However, the current reality is that these tools make predictions or guesses based on the datasets they have been trained on.
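This idea of prediction from training data can be sketched with a toy example. The snippet below is a minimal illustration only, not how ChatGPT actually works: real LLMs use deep neural networks over enormous corpora rather than simple word counts. But it captures the core mechanism, which is that the system outputs the continuation that appeared most often in the data it was trained on.

```python
from collections import Counter, defaultdict

# A tiny corpus standing in for the vast text datasets an LLM is trained on.
corpus = "the cat sat on the mat the cat ate the fish".split()

# For each word, count which word follows it and how often.
next_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_counts[current][following] += 1

def predict_next(word):
    """Return the continuation seen most often after `word` in training."""
    counts = next_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat": the most frequent word after "the"
```

The model has no understanding of cats or mats; it simply reproduces the most common pattern in its data. This is also why biased or inaccurate training data produces biased or inaccurate outputs.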

Consequently, Generative AI systems may produce nonsensical or incorrect responses when faced with questions that require common sense reasoning or when confronted with sarcasm, humour, and other forms of nuanced communication. The outcomes produced by Generative AI tools can also perpetuate existing biases or inaccuracies present in the training data. For example, my Kenyan colleagues noted that the images of African children created by DALL-E were odd and inaccurate. Therefore, it is essential to critically evaluate the outputs of Generative AI and develop digital literacy skills to navigate the AI-enhanced world. This critical lens should also be applied to non-AI online content that may be biased, inaccurate, or misleading.

My work in technology for education and development allows me to collaborate with higher education faculty, leaders, school educators, and administrators in Pakistan and other developing countries. In February 2023 I gave an EdTech Lounge talk aimed at exploring generative AI and its implications for higher education. Subsequently, I have engaged in conversations with both skeptics and supporters of AI, suggesting that we may be approaching a pivotal moment in the impact of AI in education. I will reflect on a few recurrent issues in the remaining part of this article.

Recently, I attended a meeting where a school leader in Pakistan shared that they have incorporated ChatGPT into the educational process, encouraging students to use it to enhance their learning experience. However, they noted that teachers are lagging behind in adopting it, highlighting the gap between students' and teachers' use of technology. In Pakistan, teachers are often marginalized in terms of technology use, due both to system inertia and to teachers' reluctance to learn and adopt new tools. The education system tends to maintain the status quo because of a variety of factors, including bureaucratic structures, entrenched interests, and a lack of resources and support for change. The expectation is often to adopt technology without disrupting the existing educational system. However, new technology brings new sets of opportunities and responsibilities, and it is not surprising for already overworked and underpaid teachers in Pakistan to shy away from taking on more. As Khan Academy's Khanmigo demonstrates, AI is disruptive, and for schools and teachers to remain relevant, both must evolve.

Among education colleagues in Pakistan, the conversations have been largely dominated by concerns about students using ChatGPT to cheat or complete their assignments. Generative AI tools can produce essays, code, or images from instructions given as prompts, raising concerns about academic integrity. Several leaders have proposed using AI detection tools to combat academic dishonesty. Although it is understandable to prioritize preventing academic dishonesty, the rapid adoption of Generative AI in educational institutions, without sufficient consideration of its present limitations and without preparing teachers and students in terms of digital literacy, raises concerns about our priorities.

The lack of trust in our K-12 and higher education systems' ability to instil ethical and moral values in our students, enabling them to make the right choices, is particularly intriguing. While it is unsurprising that students quickly embrace new technologies like ChatGPT, it is concerning that we, educators and leaders, doubt our students' ability to use them ethically. Our first instinct is to expect the worst and assume that students will use Generative AI to cheat. But in my experience, most students do not want to cheat if they are provided with supportive, non-judgmental, and safe learning environments. If we are concerned about cheating, what does that say about our education system? If we are failing to instil ethical values in our students, can we afford to continue doing things the current way instead of embracing new approaches afforded by Generative AI?

Generative AI tools are already being integrated with other commonly used tools, such as word processors and search engines. Microsoft's Copilot and AI-powered Bing, or Google's Bard, are a few examples of a rapidly evolving landscape. As these tools seamlessly integrate into our thinking and writing processes, we must consider what constitutes creativity and originality. Some critics argue that the output generated by these tools is not truly original, as it relies on existing data, and that using pre-existing data to create new content constitutes plagiarism. Others contend that humans also rely on recycled material when producing work, so it may be unfair to label Generative AI content as plagiarism. Overall, there are no clear-cut answers at the moment. What is important is to consider why students would turn to AI for cheating instead of learning, and how we might support and prepare them to be successful learners and citizens.

This is a moment for us to focus on the role Generative AI might play in enhancing the learning process. One initial step is to rethink our pedagogy and assessment. Given the ease with which LLMs can generate essays and reports, we could have students submit drafts for feedback, rather than solely focusing on the final version. We can help students explore the use of these tools for brainstorming or editing and acknowledge their use in the assignments. We can guide students in detecting bias and misconceptions by analyzing multiple outputs produced by Generative AI tools.

Finally, technology is a tool or a means to an end and its impact is ultimately determined by how we use it. Therefore, to ensure ethical and responsible use of technology, we must instil values in our education systems and encourage students to make responsible choices. Rather than dismissing Generative AI technologies as cheating tools or using them to justify increased surveillance and punitive measures, we should use them as tools to reimagine education to promote human-centeredness and critical digital literacy and foster ethical behaviour and learning.

The writer is Director Blended and Digital Learning and a faculty member at Aga Khan University Institute for Educational Development. She can be contacted at: azra.naseem@aku.edu.
