AI Tutors: On Spock, Big Hero 6, and Learning with AI
- Peter Nilsson

- Jan 17
In the 2009 Star Trek reboot, a young Spock is shown in a Vulcan school. It is a dark, cavernous room with learning pods like giant bowls carved into the floor, a pod for each student, each pod made of a ring of screens, each screen brimming with information. Adults pace the floor above. AI voices in the pods pepper students with questions and blanket the screens with dazzling graphs and equations, and the students answer, demonstrating their mastery. It is a vision of AI tutors engaging in personalized conversations with young students, and an elementary school-aged Spock answers a final question we don’t fully hear with apparent profundity: “...when it is morally praiseworthy, but not morally obligatory” (Abrams, 2009).
“This is the vision!” said an edtech investor to one of the authors.

Still from Star Trek (2009)
No.
In fairness, the prospect of an all-knowing, easily conversant AI tutor able to conjure images and video in a deeply interactive, personalized experience is a powerful vision. But a future in which elementary school students are isolated from each other, treated as purely cognitive beings and grilled on abstract academic facts, is a disaster.
It’s easy to overlook what happens next in the scene. When the lesson finishes, young Spock is bullied by three other Vulcan children in coldly rational language. He retaliates by shoving one of the bullies into a learning pod and beating him with abandon. It’s the only social experience in the short scene, and it’s a pack of bullies taunting Spock because he is half-human. They mock him for having emotions.
The vision of learning pods is novel, flashy, and seemingly intellectually engaging, but the director’s choices – a dark and cavernous room, adults pacing like prison guards, cold mockery of the protagonist for his humanity, and violence in the pod itself – show that this is a dystopia. At least for humans. The novelty is enchanting, but the nuance is disturbing.
Disney explores a different extreme:
In the animated feature Big Hero 6, the main character, Hiro, has an AI companion robot named Baymax, whose sole charge is to care for him. In caring for Hiro, Baymax acknowledges the tragedies in Hiro’s life and tends to his emotional and physical state, coaching him away from recklessness and toward more responsible actions. Baymax takes an embodied form, and his focus is Hiro’s well-being. This is a vision of a support machine centered on health. In the context of the larger storyline, Baymax guides Hiro and his friends back to school, where they learn and continue to seek mentorship from professors. Baymax doesn’t teach Hiro; he helps set the emotional conditions for Hiro to learn (Hall & Williams, 2014).

The healthcare robot Baymax, from Big Hero 6 (2014)
Still, despite Baymax’s good intentions, Hiro resists his coaching, makes bad decisions, and even turns Baymax into a weapon. While Baymax aims to be a positive influence, Hiro must learn from his own experiences to draw the most important lessons the movie seeks to convey.
One vision is purely rational and intellectual, and the other is emotional and physical. The writers of each make it clear that neither is fully successful: the growth arc in both cases requires learning from experience.
But what if we could put these together? Would we have achieved the vision of an AI tutor? Would this replace teachers? Is that the vision?
These movies are markers in the sand. Drawing the best from each conjures a tantalizing and powerful vision that AI companies today are seeking to realize. Today’s technology is not only easily conversant like the screen-based AI in Star Trek, but also increasingly sensitive to our emotional state and on the path toward becoming embodied robots like Baymax in Big Hero 6. Still, even fully realized, this vision cannot replace what teachers provide.
Learning and life do not happen in isolation. We learn socially from our peers. Our mastery of concepts and skills takes place within the context of other people in school, work, and family environments. The meaning that we derive from learning ultimately comes from how it changes who we are and what we do in the context of other people. This is why learning with others is so important. We may develop personal passions, interests, and ideas learning on our own, but those passions, interests, and ideas become significant when we share them in a classroom and see how others respond to them, when we hear others’ ideas about the same topics, or when we discover that what we thought we knew must be updated.
So even if AI tutors were sensitive to the multiple layers of our humanity, our inherently social nature requires that we learn with other people.
This is not to say that AI tutors are futile pursuits. Rather, despite these shortcomings, AI tutors are poised to advance and evolve substantial portions of what and how we and our students learn. The questions are how best to design them, and how best to make use of them. This is a topic we wrestle with in greater depth in chapter 4 of Irreplaceable: How AI Changes Everything (and Nothing) in Teaching & Learning.
Visions of teaching machines have been around for a long time (Watters, 2021). They are an alluring idea, but humanity’s record at creating them is riddled with failures. Nonetheless, today’s technology brings us closer to that vision than we have ever been before. In chapter 4, we look at some of the history of these efforts, some of the roadblocks they have faced, how today’s technology is engaging these roadblocks, and the implications of these new developments for learning.
Irreplaceable: How AI Changes Everything (and Nothing) in Teaching and Learning on Amazon and from Solution Tree
References:
Abrams, J. J. (Director). (2009). Star trek [Film]. Paramount Pictures; Spyglass Entertainment; Bad Robot.
Hall, D., & Williams, C. (Directors). (2014). Big hero 6 [Film]. Walt Disney Animation Studios.
Watters, A. (2021). Teaching machines: The history of personalized learning. The MIT Press.