UNITED STATES
Academic use of AI for teaching is on the rise – New research
After a period of profound scepticism towards AI among American academics, new research has shown that three quarters have now “taken at least modest steps towards integrating generative AI into their teaching”. This should come as a relief to universities that have been investing heavily in AI use for instruction.
“A majority (66%) of instructors indicated being at least somewhat familiar with generative AI tools, while only 16% reported having little to no familiarity,” says a report on a survey published on 20 June by not-for-profit research organisation Ithaka S+R, which has run a triennial national United States faculty survey for the past two decades.
“These are remarkable numbers given that OpenAI released ChatGPT just 13 months prior to the launch of our survey, underlining the speed at which generative AI is spreading. Instructors have clearly been paying attention,” the report states.
The breathtaking speed at which generative AI has swept through higher education and become a key strategic issue has also been revealed by other surveys of AI use in teaching as well as in research.
AI has also become a focus of research itself, both as a tool for conducting research and as a subject of study into its impacts on higher education.
A recent survey by Oxford University Press showed that 76% of academic researchers around the world use some form of generative AI in their work, and a national survey of students in the United Kingdom reported in January that 53% have used generative AI to help with their studies.
Generative AI and Postsecondary Instructional Practices: Findings from a national survey of instructors is authored by Dr Dylan Ruediger, senior programme manager for the research enterprise at Ithaka S+R, together with researcher Melissa Blankstein and analyst Sage Love of the libraries, scholarly communication and museums programme.
“Our survey provides compelling evidence that instructors are exploring instructional uses of generative AI in large numbers. It also highlights ongoing uncertainty about how best to use the technology and indicates that many instructors do not allow students to use generative AI tools,” the authors write.
Three primary findings
There are three primary findings.
First, while most instructors now have at least a passing familiarity with generative AI tools, “many, especially older instructors, are not confident in their abilities to use them for pedagogical purposes or in their value in educational contexts”.
Second, 72% of instructors have experimented with using generative AI as an instructional tool. “While instructors are using generative AI in many different ways, no individual use case has become particularly well established,” the survey report says.
Third, most lecturers want some kind of support to help integrate AI into courses, but only a minority are looking for specific forms of support, giving little direction to institutions that are trying to provide services.
Ruediger told University World News that one of the survey’s most interesting findings is that many faculty who are themselves experimenting with how to use AI as a teaching tool do not allow their students to use it.
“We don’t have hard data about why 42% of instructors prohibit their students from using generative AI. But one plausible explanation is that many instructors want to become more familiar with how the tools work before allowing their students to do so.
“Only 18% of faculty were convinced that generative AI would have a positive effect on teaching and learning in their field, and there are privacy, equity, and ethical concerns about generative AI that many faculty are legitimately worried about,” Ruediger said.
“Our general impression is that instructors are curious but not yet convinced of the value of these tools, which could account for their caution about allowing or encouraging student use,” he added.
The backdrop
Following the November 2022 public release of ChatGPT, higher education institutions around the world found themselves in a new technological environment that challenged long-standing norms around academic integrity and raised questions about how learning happens.
Since then, there has been a flurry of activity around equipping students with AI skills and literacies, using AI technology appropriately to promote teaching and learning, establishing best practices, and developing generative AI platforms, among other things, says the report.
“Understanding how instructors are (or are not) using generative AI in their classrooms is vital because most college and university guidelines leave decision making about how, when, and if generative AI use is permitted to the discretion of individual instructors,” it states.
For this reason, Ithaka S+R included a four-question section specifically on generative AI as part of its national faculty survey, which was sent to 135,284 faculty members from a range of disciplines and institutions and conducted from 7 February to 10 March 2024. There were 5,259 responses.
Within the larger survey, respondents were randomly assigned one of two extra sets of questions representing “topical deep dives” – on generative AI, or on academic freedom and censorship.
There were 2,654 responses received for the AI deep dive, and its findings have been published first. White respondents and people aged 45 or older each make up 75% of the response pool; 51% are women; 61% are tenured or tenure-track; 54% work at doctoral universities; and 44% are in the social sciences, 29% in the humanities and 24% in the sciences.
AI familiarity but not understanding
One of the most important findings is that while faculty are open to integrating AI into teaching, “they don’t yet know how to do it effectively and are not yet convinced that generative AI tools can have a positive impact on student learning”, Ruediger told University World News.
“There is a real need for universities to provide greater support in these areas and for academic communities to articulate compelling use cases if they want more faculty to adopt AI-informed pedagogies,” he explained.
Only 18% of respondents agreed that they understand the teaching applications of generative AI, and only 14% agreed that they feel confident in their ability to use AI in teaching. This is despite a proliferation of tips, guidelines, workshops and other resources on instructional uses of AI.
The survey also found high levels of uncertainty about whether generative AI’s net effects will be positive or negative: 56% of lecturers were on the fence, while 19% felt AI would benefit teaching in their fields and 25% believed its impact would not be positive.
Scepticism of AI’s value is strongest in the humanities, with 45% of faculty not believing that AI’s impact on their field will be positive. One comment was: “Philosophy is a discipline that teaches thinking as a practice, and the use of AI to do thinking for you destroys this practice.”
The survey found that adoption of AI is at least partly a generational issue. For instance, while 53% of 22- to 34-year-olds are familiar with generative AI, only 26% of people 65 or older are, and the picture is similar regarding understanding AI and confidence in using it. However: “Faculty scepticism and uncertainty are cross-generational concerns.”
How instructors use AI
Some 72% of academics reported having used AI for at least one instructional activity, but AI is being used in a variety of ways and no individual use has become well established. The most common uses were designing course materials (22%), helping with email or other administrative tasks (16%), and creating images or visualisations (15%).
Interestingly, write Ruediger, Blankstein and Love, academics who have not engaged with AI in teaching are most often at doctoral institutions (where a high 58% have not engaged) and are most often in the social sciences: 41% of social scientists have not engaged with any generative AI for teaching, followed by 30% of humanists and 27% of scientists.
Support for integrating AI into courses
Academics are interested in building AI skills, especially in three areas: creating tutorials and study guides (which 38% rated very or extremely valuable); creating images or visualisations for classroom use (36%); and using AI to design syllabi, assignments and other materials (36%).
Around half of instructors see some value in instructional support for engaging with AI. Thus, the report says: “Universities that build out services to support a range of AI-informed instructional uses will have a meaningful audience.”
It was clear from the survey that many academics who have experimented with generative AI for instructional purposes do not allow students to do the same for their studies: 42% in all.
Among the academics who do allow students to use AI, its most likely permitted use is as a brainstorming tool: 37% of instructors allow this, followed by outlining (23%), drafting or revising written assignments (23%), and as a study guide (21%).
“It is noteworthy that the most commonly encouraged [and/or] allowed use cases were those related to writing, a finding that is somewhat surprising given that much of the early controversy around generative AI focused specifically on its abilities to enable plagiarism.
“One likely explanation is that writing occupies a unique role as a sort of default assignment modality in a wide number of academic disciplines,” the report states.
Concluding thoughts
Ruediger, Blankstein and Love point out that the Ithaka S+R survey is one of several “that indicate high levels of uncertainty and deep pockets of pessimism” about whether the impacts of generative AI on teaching and learning will be positive or detrimental.
“Serious concerns about academic integrity, ethics, accessibility and educational effectiveness are contributing to this uncertainty and hostility,” they explain.
Universities and colleges are allocating significant resources to generative AI, concludes the report. While administrators have largely moved away from prohibition-based approaches to AI, this is clearly not the case for many academics.
“Whether those steps portend more openness to generative AI or confirm existing scepticism will play an important role in determining how far faculty will go on the road on which administrators hope to lead them,” the report states.