Since spring 2023, we at Tyton have been measuring how higher education has been grappling with the rise of generative AI tools in teaching and learning. In Time for Class 2023, our annual report on digital teaching and learning, we wrote about instructors’ deep concerns regarding academic integrity as student use of generative AI in their courses ramped up. Our latest installment, Time for Class 2024, delves deeper into the impact of generative AI tools in the classroom. This year, our survey of about 1,600 students, 1,800 instructors, and 300 administrators revealed the results detailed below regarding generative AI in higher education.
As students become regular and sophisticated users of generative AI tools, institutions have more to do to keep up. Read the full 2024 report here to learn more about the current state of generative AI in higher education and our recommendations for the digital future.
Over the past year, all major stakeholder groups in higher education—students, instructors, and administrators—have significantly increased their use and awareness of generative AI tools (see chart below). Students remain at the forefront of this trend, with 59% using generative AI regularly (at least monthly), compared to approximately 40% of instructors and administrators. The increased adoption of generative AI in teaching and learning confirms our Time for Class 2023 prediction that generative AI is here to stay.
With the advent of publicly available generative AI platforms, the number of digital academic tools that either rely entirely on or are enhanced by AI has ballooned. Despite this, most regular generative AI users in our survey rely on generalist, freemium solutions such as ChatGPT, and adoption of specialized tools for teaching and learning remains limited. However, almost half of regular student users report opting for paid AI solutions (see chart below), and when they do pay, it is for tools that offer specialized academic functions, suggesting more sophisticated tool use as familiarity with and dependence on these technologies grow. Adult learners and learners taking courses entirely online are especially likely to be paying for generative AI tools. Instructors, on the other hand, are far less likely than students to use paid, specialist solutions, suggesting that the academic generative AI tool market may be meeting student needs better than instructors’.
In this year’s survey, 50% of students say they are likely to continue using generative AI tools for schoolwork even if their institution or instructor banned them, a 21-point increase from spring 2023. As a result, and as in last year’s findings, academic integrity remains a top concern for instructors facing student use of generative AI tools in their courses.
This concern, and the need to prevent cheating, helps explain why 34% of instructors report that the availability of generative AI tools has increased their overall workload rather than decreased it by streamlining the generation of course content and assessments, as many tools promise. Instructors who report this increase say they are spending more time monitoring academic integrity and enforcing policies and/or redesigning assessments to counter AI usage (see chart below).
It is worth noting that instructors who use AI tools are less concerned about misuse: 77% of instructor non-users expect the rise of generative AI tools to create new challenges in identifying plagiarism, compared to 60% of users. Similarly, instructors who use generative AI tools are more likely to see the potential for these tools to improve student learning.
From the student perspective, 45% of regular (at least monthly) generative AI users report an increase in academic workload, compared to 22% who report a decrease. This uptick in workload is likely due to the time students spend learning to use generative AI tools to better understand concepts and ensuring they are not violating academic integrity policies.
As it stands, this increase in workload for both students and instructors underscores the double-edged nature of generative AI tools: they offer significant benefits but also introduce complexities that institutions, instructors, and students must navigate, and quickly.
However, institutional policies are still in the early stages of implementation. Our survey shows that 76% of administrators indicate their institutions have not fully developed or implemented institution-wide policies on the use of AI tools. Although 37% are actively working on policies, only 24% have them in place, more than a year after generative AI first became a concern for instructors.
Simultaneously, sentiments regarding generative AI tools are becoming increasingly positive. Compared to spring 2023, significantly more administrators and instructors report believing students will need to know how to use generative AI tools for future jobs/careers, and that the responsibility to teach this lies with the institution (see chart below).
Therefore, we urge institutions to match their policies and practices to these shifting sentiments and to offer support for instructors and students navigating the use of generative AI in the classroom. We know this would require institutions to commit fiscal resources and technical infrastructure that may not yet be available. However, providers have already begun this process: OpenAI recently announced ChatGPT Edu, an affordable, more secure version built for universities. So, as institutions move forward and consider how to adapt to generative AI tools at enterprise scale, we hope this year’s research in Time for Class 2024 can help connect new investments and ongoing operational costs to institutional mission, improved student learning outcomes, and other urgent strategic imperatives.