Josh Fruhlinger
Contributing Writer

My robot teacher: The challenge of AI in computer science education

feature
Jan 6, 2025 | 11 mins

Can generative AI accelerate CS learning, or will it short-circuit the creative problem-solving skills future developers need?


Over the past two years, generative AI has been a force for transformation (and disruption) everywhere it's landed. Education was no exception; if anything, schools were among the first institutions to grapple with AI's implications. As students embraced ChatGPT and similar AI technologies for research, test preparation, and academic writing assistance, among other things, educators found themselves at the forefront of a sweeping societal change, and of a growing ethical dilemma: Should AI-assisted learning be accepted as a new normal in education, or are students cheating themselves by not learning basic skills?

If anything, these debates are more acute in computer science education than elsewhere. Professional developers are among the most enthusiastic early adopters of generative AI tools. To ask learners, whether they're in high school, college, or taking a professional course, to go without AI assistance seems almost as quaint as making them use punch cards to input their test responses. But there's still a reasonable question: How can we ensure students develop the foundational knowledge to understand and evaluate AI suggestions? We spoke to professionals with a foot in computer science education to find out how AI tools are transforming the learning process.

Fundamentals matter more than ever

Seth Geftic, VP of product marketing at the cybersecurity firm Huntress, is heavily involved with mentorship in computer science and cybersecurity. To him, and to nearly everyone we spoke to, one of the biggest risks of AI in a learning environment is that it can help students bypass the knotty problem-solving exercises crucial to their education. "AI in the learning experience makes it extremely easy to seek help as soon as you come up against something you find difficult or strange," he says. Ideally, in a learning environment, "you'll need to improvise, think outside of the box, and find unusual (and productive) ways that your building blocks of knowledge can interact and help you solve a question. These harder segments where you have to think for yourself are the real teaching moments in a computer science course and are the moments that will set apart students that are okay from students that are fantastic."

"When AI is right there, it becomes extremely easy to turn to the machine if you come up against these moments that require additional thinking," he adds. "When people rely too much on artificial intelligence, I worry that there become fewer and fewer moments that make you develop that creative muscle. I'm always going to champion teaching people to fish, rather than teaching them how to interact with an AI that can fish."

Michael Wilson, COO of GenTech, a community tech hub with school- and community-based STEM programs, says it's largely a matter of how AI is used. "For students, AI makes searching for an answer easier, but when used as a forum rather than a search engine, it becomes harmful," he says. "Asking 'Write me a program that prints hello world constantly' is different from 'How do I print to the screen?' followed by 'How do I repeat a section of code forever?'" Iterative prompts like these can lay the groundwork for fundamentals.
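Wilson's two iterative prompts map onto two tiny pieces of Python. The sketch below is our own illustration of that progression, not GenTech course material; the loop is capped at three iterations so the example terminates, where the student's true "forever" version would have no exit.

```python
# Step 1 -- "How do I print to the screen?"
message = "hello world"
print(message)

# Step 2 -- "How do I repeat a section of code forever?" -- a while loop.
# A real "forever" loop would be `while True:` with no exit condition;
# we cap it at 3 iterations here so this sketch terminates.
lines_printed = 0
while True:
    print(message)
    lines_printed += 1
    if lines_printed == 3:
        break
```

A student who has built the program one question at a time can explain every line of it, which is exactly the fundamentals-first outcome Wilson describes.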

At the university level, students are expected to understand the importance of learning fundamental concepts and not taking shortcuts. Dr. Tirath Ramdas is the founder of the education-focused GenAI company Chamomile.ai and also teaches an undergraduate software development course at a major university in Melbourne, Australia. In both roles, he has a front-row seat to how generative AI impacts software development teaching and learning. "The university's policy is for students to use any tools available as a learning aid but to take responsibility for their own learning, to ensure they really understand the material rather than rely on code generation," he says. "Overall, I think university students deserve some credit for their attitude towards generative AI. A survey by the Harvard University Undergraduate Association found that the number one reason students abstain from generative AI use is to avoid becoming over-reliant on it, so their instincts are right."

AI guardrails in the classroom

At a fundamental level, instructors need to know students are completing assignments in ways that help them improve their skills without cutting corners. The experts we spoke with described various approaches to this problem. Some allowed the use of generative AI while others restricted it.

Open book, closed prompts

At Dr. Ramdas's Australian university, most exams are open-book and allow almost unlimited Internet access. "Interestingly," he says, "these exams have always had a policy of barring communications apps so that students couldn't communicate with others to do their work. This policy has now been extended to include ChatGPT-like systems, and invigilators are instructed to look for cases of cheating with such systems. It may be detected when students are seen writing paragraphs of text, as one does when prompting an LLM but not when coding normally."

Keep it focused

Elmer Morales is the founder and CEO at koderAI, and is training his daughter and other software engineers who are learning to code. "I've seen professors ask students to ensure their code only uses topics they've learned in class, which is something the AI won't necessarily know," he says. "This is not entirely bulletproof, but it does force the student to review the code before using it, which is a form of learning and helps noobs improve their coding skills."

Talk it out

"One lever I use is 'interactive grading,' in which students need to explain their solutions to me," says Greg Benson, a professor of computer science at the University of San Francisco and chief scientist at SnapLogic. "At least in my courses, because they are more advanced, it is often easy to detect machine-generated code or to observe through dialog if a student doesn't understand a solution they have submitted."

Huntress's Geftic points out that this sort of process isn't just busywork meant to stop students from cheating: Explaining the code you've written (with or without AI help) is an important part of a developer's professional skill set. "Conversational testing," as he calls it, "goes hand-in-hand with the building of secondary communication soft skills, which are becoming more and more sought-after in the IT space."

Go hands-on

Maksym Lushpenko is the founder and CEO of Brokee, which challenges students with devops labs consisting of broken systems they need to fix. "Our labs are designed so that you can't just copy the description and have AI do all the work for you," he says. "Important details about what's broken are hidden in the test environment, so students need to explore: run commands, check logs, and figure out what's going on." He says that such environments allow students to use generative AI "responsibly" in a learning environment. "AI can definitely help, like explaining how a system works or how to run a specific command, but at the end of the day, the student still has to put it all together and fix the problem."

Appropriate reliance and responsible use

Everyone we spoke to agreed that the "horse is out of the barn" when it comes to generative AI and computer programming: You're not going to stop people from using it in their careers, so the trick is ensuring people know how to use it responsibly. As KinderLab Robotics' director of curriculum, training, and product management, Jason Innes focuses on AI-assisted learning for young children. He told us he very much believes AI needs to be part of the curriculum from early education forward.

"Young children need to understand what AI is before they learn to use it. They need to learn that AI is not alive, has no goals of its own, and is not always right, and that it is a tool created by human engineers," he says. "If we want to prevent kids from taking shortcuts with LLMs, we need to help them develop an understanding of what AI is and what its limitations are. AI is a tool that can help people think and work better, but we still need to master the fundamental cognitive skills that we want AI to assist us with."

This lesson isn't only for kindergartners. Danielle Supkis Cheek is VP, head of AI and analytics, at Caseware, and also a part-time faculty member at Rice University's Jones School of Business, where she teaches data analytics in the Masters of Accountancy program. In her view, it's crucial for students to understand AI as part of the landscape of tools and information in which they'll be operating. "It's not a realistic scenario for a student of mine to ever know every scenario and the pace of change of what is changing out there," she says. Instead, she wants her students to understand tools like AI, focusing on "the metacognition concept of how to learn, how to understand, and how to be skeptical of the responses, and how to then follow up and make sure you can take appropriate reliance. That's the skill set that I'm teaching."

In Supkis Cheek's class, students are meant to learn the processes that real-world accounting professionals would follow, which might include AI. "The answer to me is not as important as the process by which the student got the answer," she says, "and so I need them to use processes that are available to them in the real world so that they can use this in the university setting." The experience is also meant to introduce students to the rigorous world of corporate finance, and help them learn when certain processes and techniques are acceptable. "The course is a safe place to fail so that when they get to the real world, they are more appropriately equipped," she says. "Here, the worst thing that happens to you is to get a bad grade, versus the worst thing that happens if an auditor misses something, which may be a deficient audit report that results in significant fines, lawsuits, and erosion of public trust."

Asking the right questions

For many educators, AI isn't something to merely accommodate in their curricula; instead, they're actively building their courses to teach students how to use it effectively. After all, as Brokee's Lushpenko puts it, "To get the most out of AI, you still need to know what questions to ask and how to apply the answers it gives you."

"Other kinds of training need to be developed with AI tools specifically in mind," says Risto Miikkulainen, AVP of evolutionary AI at Cognizant AI Labs and a professor of computer science at the University of Texas at Austin. "Such assignments may be larger than current ones, with overall design done by students, and detailed low-level implementation done by AI tools. They may include new assignments such as upgrading software, debugging, and repair, that are currently tedious but where AI tools can help significantly. They may also include designing software so that AI can be most effectively used in the future to upgrade and maintain it as well."

Benson at the University of San Francisco is also thinking along these lines. "I will be teaching our upper division operating systems course next semester in a way that allows students to take full advantage of coding assistants on projects," he says. "I will be making the projects more complicated than my previous projects, and I will also guide the students on how to use assistants to both develop solutions and how to use them to learn OS concepts and code more deeply."

AI is for educators, too

So far, we've focused on student use of AI, but generative AI can also help educators. For instance, koderAI's Morales points out that it can help answer simple student questions that might otherwise eat up class time. "Experienced human software engineers (professors included) don't have time to help entry-level coders every time they get stuck or when they forget something like indenting a line of code in Python," he says. "This is where generative AI has been a game changer, allowing up-and-coming coders to simply share their code with AI and have the AI make suggestions or provide code that would help them continue their learning journey without having to wait for a human."
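The kind of beginner slip Morales mentions, a forgotten indent in Python, is a good example of the quick, low-stakes fix an assistant can supply without a professor's time. The `greet` function below is our own hypothetical illustration, not code from koderAI:

```python
# What a beginner might write (shown as a comment, because running it
# would raise IndentationError: expected an indented block):
#
#   def greet(name):
#   return f"hello, {name}"    # body not indented under the def
#
# The fix an assistant would point out is simply to indent the body:
def greet(name):
    return f"hello, {name}"

print(greet("world"))
```

The error message names the problem but not the habit behind it; an assistant can explain both, leaving classroom time for harder questions.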

Plus, as GenTech's Wilson says, it isn't just students who need a quick answer sometimes. "For teachers, AI gives them the ability to find an answer or different view to an obscure question that could be asked by a student," he says. On the voyage of learning, teachers and students will be discovering how these new tools do (and don't) work together.

Josh Fruhlinger

Josh Fruhlinger is a writer and editor who has been covering technology since the first dot-com boom. His interests include cybersecurity, programming tools and techniques, internet and open source culture, and what causes tech projects to fail. He won a 2025 AZBEE Award for a feature article on refactoring AI code and his coverage of generative AI earned him a Jesse H. Neal Award in 2024. In 2015 he published The Enthusiast, a novel about what happens when online fan communities collide with corporate marketing schemes. He lives in Los Angeles.
