Investigating the Impact of Codio Coach, a Specialized AI Learning Assistant, on Computing Student Engagement and Performance

Mohit Chandarana, Sindhu Ramachandra, Joshua Ball, Maura Lyons, Phillip Snalune
30 Jun 2025
Abstract
Recent research has demonstrated significant advances in the application of Large Language Models (LLMs) in educational environments, particularly in delivering immediate, personalized student feedback. This study examines the impact of Codio Coach, a specialized AI learning assistant integrated into the Codio platform, on student engagement and performance in asynchronous MOOC-style computer science courses. Codio Coach uses LLMs to provide support without supplying direct answers. It consists of three modules: Summarizer, which simplifies assignment instructions; Error Explanation, which clarifies programming error messages; and Hints, which provides Socratic-style hints by posing questions or suggestions to guide students toward solutions.
Analysis revealed an immediate and sustained uptake in assistant usage, with "Explain this error" being the most frequent interaction (56.3%), confirming engagement and highlighting student need for error comprehension support. Assignments where Coach was enabled showed improved student performance, with a 12% increase in Mean Grade and a 15% increase in Median Grade. Furthermore, an impressively low error event rate (0.12%) observed in these AI-assisted courses suggests early signs that such tools may contribute to more effective programming environments.
These findings provide valuable evidence for the efficacy of tailored AI learning assistants in enhancing student engagement and performance in CS education. We recommend educators guide students in leveraging custom, context-specific assistants to improve learning and develop critical AI application skills.
Download the full paper