Must Read Blog June 27, 2023

Investing in AI Edtech: Cutting Through the Hype

Last week, we published our annual Time for Class study, which focuses on the state of digital learning and AI in higher education. The study examines how faculty and institutional leaders are using digital learning tools in teaching and learning to implement practices that can improve student outcomes. This year’s edition also gathered the student perspective, revealing important disconnects between educators and learners. These surveys took the pulse of higher education regarding the rise of artificial intelligence tools like ChatGPT. This month, we aim to translate that perspective into considerations when evaluating investment opportunities.

The ongoing wave of hype around artificial intelligence has given rise to cloudy narratives regarding both the upside and downside of this technology’s impact on education. While AI technologies could transform many aspects of education, administrators and instructors – particularly in higher education institutions – are facing clear, immediate-term challenges.

Academic integrity quickly becomes a top concern

One early area of concern is academic integrity. Institutions have historically addressed academic integrity through the adoption of anti-plagiarism tools and assessment security solutions. However, since ChatGPT’s release in November, “preventing student cheating” has become the top instructional challenge for faculty, jumping from the tenth-most challenging issue in a single year.

[Chart: Time for Class 2023, Top 10 Challenges for Instructors in Higher Education — “preventing students from cheating” was a low-ranked challenge in 2022 and a top priority in 2023.]

These concerns are in part justified: students are adopting generative AI tools at much faster rates than administrators, instructors – or their curricula and assessments – can keep up with. In March 2023, roughly 100 days after the launch of ChatGPT, 27% of higher education students reported being regular users of generative AI tools, compared to only 9% of instructors and 8% of administrators. Moreover, institutional policy responses have been slow, with only 3% of institutions reporting a fully implemented AI policy during the same period. Whether policies will be effective once in place is another thorny question; survey data indicates that students would continue to use AI tools regardless of institutional or course-level guidelines or restrictions, suggesting that while policies are necessary, they are not sufficient.

Investors should look beyond just academic-integrity uses

While the academic integrity discussion brought on by generative AI has taken center stage, other important areas of concern remain where administrators expect funding to increase. From a student success perspective, administrators’ top areas where they anticipate increased funding include:

• Better serving and supporting students from a variety of backgrounds;
• Supporting students’ mental health and wellness needs; and
• Initiatives to support students with financial need.

Investors should ensure that any AI-powered company they back is using the technology to solve an institutional problem toward which dollars are actually flowing, as opposed to solving a lower-priority challenge simply because the technology is capable and flashy.

In the near term, AI appears to be more often a product capability than a business model

Since the launch of OpenAI’s APIs, small and large companies alike have begun adding AI large language model capabilities to their existing product suites – Khan Academy’s Khanmigo and Chegg’s CheggMate are just two examples – putting these functionalities into many hands quickly. Despite the institutional education sector’s tendency toward slow adoption of innovation, these inexpensive, easy-to-use APIs could lead many more providers to experiment with and tout new functionalities. With that, it will be critical that innovators experiment safely – with efficacy and the student and instructor experience top of mind – as these base models are generally not tuned for safe educational use out of the box.

Further, investors should stay disciplined in their pursuit of sustainably differentiated offerings, as it remains unclear whether this type of API integration is enough to create fundamentally new business models or merely a new product feature.

As Reach Capital noted, “Simply building an API to GPT-3 or another out-of-the-box model, and adding some UX wrapper around it, is not a strong differentiator.” Companies like Khan Academy and Chegg have vast amounts of vetted, proprietary data that allow them to “tune” base models to improve their safety and efficacy in educational contexts. Further, large players have an advantage simply due to scale, as the tuning process requires rapid iteration on customer test data.

Investing through the noise: criteria to consider when evaluating opportunities

While investors should be cautious of getting caught up in the AI “noise,” the promise and potential of this technology justify exploration. Tyton recommends taking a principled, fact-based approach to evaluating opportunities in this space. More specifically, look for companies that …

• … use AI to solve a high-priority, real-world challenge – i.e., avoid disconnected techno-solutionism
• … have buyers who understand the value of the solution and are willing to spend
• … have access to proprietary, vetted, structured historical data to fine-tune the base model in a safe and unique way
• … have the scale to rapidly iterate on a volume of user feedback upon release

Tyton is closely monitoring this space as generative AI continues to make waves across education. Be sure to check out our recent piece highlighting K-12 educators’ perspectives on AI in the classroom.

As always, we welcome the opportunity to continue this discussion.