We're Forgetting How To Fly
What autopilot systems can teach us about the dangers of relying on AI
Thank you for being here. As always, these essays are free and publicly available without a paywall. If my writing has been valuable to you, please consider sharing it with a friend. If you feel so inclined and wish to support me materially, please consider signing up for a paid subscription by clicking the button below.
Technology promises us more autonomy. The reality is often much more complicated.
As we grapple with how to help our students navigate a world where they will constantly face decisions about when and where to outsource their work to AI, we will need to adapt the knowledge and skills we teach them. Even if we don't change what we teach, we'll need to keep talking with our students about why it matters. Most importantly, beyond the skills and knowledge we want them to build, we will need to focus much more intentionally on helping them cultivate the wisdom to engage well in a changing technological landscape where the long-term consequences of working with a mechanical co-pilot are still unknown.
Luckily, these questions are not entirely new, at least if you know where to look. There are fields where automation has already become deeply integrated. One of those fields is aviation. While we talk casually about AI being a co-pilot for our work, pilots are much more familiar with the opportunities and dangers of relinquishing control to an autonomous system.
The challenges embedded in the AI co-pilot model
I'm still ruminating on what my experience with Tesla Full Self-Driving last week can teach us about AI. In many ways, applications of AI in areas like autonomous vehicles highlight the underlying ethical questions more clearly than other applications, like using generative AI tools to help you write. Questions like:
Who bears the ethical responsibility for the safe operation of this technology?
How should an autonomous technology be judged to be reliable or safe for general use?
What constitutes responsible oversight or management?
What training and/or regulation should exist to protect the safety of others?
While these questions might seem unique to the context of autonomous navigation systems, whether in aircraft or vehicles, they have broader applications as we think about outsourcing tasks to AI in the domain of knowledge work.
The impacts of airplane autopilot systems on pilots' flying abilities are mixed. On the one hand, the autopilot can help reduce the fatigue and workload of pilots. This is especially valuable during longer flights. Automated systems can also often perform routine tasks with higher accuracy and precision than humans.
However, these advantages come with significant drawbacks. For one, reliance on autopilot systems has been shown to lead to atrophy in manual flying skills. Use it or lose it, as they say. It's not hard to imagine the same phenomenon applying to the tasks we outsource to generative AI: if I use AI to write or code for me, after some time I may find myself unable to write or code without it.
Another challenge of co-piloting is that the moment when you need to intervene is, almost by definition, a complex and challenging scenario. Whether or not the autonomous system automatically senses and flags its need for human assistance, the situations in which you need to grab the yoke are consequential. High-stakes moments paired with skills gone rusty from lack of practice are a bad combination.
Even worse, autonomous systems can battle a human pilot for control. This was well documented in the Boeing 737 MAX crashes, where the poorly designed Maneuvering Characteristics Augmentation System installed on the planes repeatedly took control away from the pilots and forced the nose down. The pilots in those crashes were tragically unable to wrest control back from the autopilot system to correct the error.
Lessons from the cockpit
As we consider how to help our students think wisely about AI, these examples from the airline industry provide some helpful parallels. Here are some questions I’m asking myself and encouraging my students to ask as well:
What skills are impacted by your use of a particular technology?
What are the situations in which you need to take control and perform the task yourself?
How has your relative lack of practice on routine tasks impacted your ability to perform in these situations that go beyond the capability of the autonomous system?
How are you planning to monitor and prevent your skills from decaying?
On the one hand, AI can enable us to extend our creative capacity in many ways. You needn’t look much further than the way AI-powered tools have opened up coding to almost anyone by creating a natural language interface. But that extended ability always comes at a cost. When we outsource a particular task, our own ability to perform that task without assistance decreases. In some contexts, this may be unimportant. But in many situations, subtle as it may be, we will lose something valuable. We should be continually asking what that is.
As Ursula Franklin would say, we must look both at the enabling and foreclosing angles.
Start also by considering not only what a specific new technology does but also what that technology prevents. Because of that technology, what can no longer be done? Whether it’s road and railway, fax and postal system, computer learning versus other ways of learning, or CD versus book, one has to look both at the enabling and at the foreclosing angle of any way in which things are done. When teaching through new technologies we must highlight not only the skills the new devices bring but also the skills that are not developed because of the use of those devices. It’s often a very real trade-off. The advantages of the new technologies don’t come for free, and the cost is not only in money.
Got a thought? Leave a comment below.
Reading Recommendations
The title of this post comes from an excerpt of Nicholas Carr's book The Glass Cage that he posted on his old blog, Rough Type. Here's what the pilots told him about automation's toll. Nick is now writing on Substack; take a look and subscribe if you are so inclined!

Even as they praise the enormous gains being made in flight technology, and acknowledge the safety and efficiency benefits, they worry about the erosion of their talents. As part of his research, Ebbatson surveyed commercial pilots, asking them whether “they felt their manual flying ability had been influenced by the experience of operating a highly automated aircraft.” Fully 77 percent reported that “their skills had deteriorated”; just 7 percent felt their skills had improved.
I really enjoyed this post from earlier this month. In many ways, it reminded me of what I wrote a while back about why AI Didn't Make Homework Ineffective.

It is my opinion that the main task for each teacher in confronting the challenge of AI is not to detect it, but to make the virtuous appeal to students that the only path to wisdom and knowledge goes through enduring hardship and persevering. I understand that many students, especially those with poorly formed prefrontal cortexes, simply don’t care about wisdom or virtue, but I think it’s wrong of us not to make an appeal to virtue at all. If we only talk in the language of discipline (“If you get caught, you’ll get punished!”), they won’t be reminded of their telos, they’ll only be reminded of danger. Whereas virtue reminds them of what they were created to be.
This week, Marcus had a conversation with my friend about writing and his new book More Than Words. In his reflections, Marcus shares why he loves writing and why loving writing matters.

For me, maintaining the habit of writing about my experiences and perspectives on education has been an essential part of my own journey. It has helped me refine and sharpen my own ideas; it has deepened my understanding of my own development as a teacher; and it has offered a space to process and reflect upon my identity as a teacher over the years.
The Book Nook
I’m excited to be reading 10 to 25: The Science of Motivating Young People by UT Austin Professor of Psychology David Yeager alongside some of my colleagues from the engineering department this semester. Given what my life has been like over the last month my reading time has taken a significant hit, but what I’ve heard and read so far about this book has made me excited to dig in.
The Professor Is In
A group of my colleagues in the department were awarded an NSF grant last fall to develop new curriculum that uses systems thinking as a framework to help foster students' sociotechnical thinking. I'm glad to be working alongside folks who care as deeply about these issues as I do, and I'm excited to be a part of the work they are leading.
You can read more about the grant and its aims from the Harvey Mudd news release last fall.
Leisure Line
About four hours and three hundred screws later (I kid you not), the kiddos' new playhouse is assembled and open for business in our new back patio area. While the kids are missing their backyard space from Altadena, a new house is helping to make the new space feel more alive.
If you too are interested in making small children jump for joy, you can buy a house of your very own for the low, low price of $399.99 from my happy place.
Still Life
Yoto finally decided to do the right thing and issue new batteries for their players in response to the recall, instead of simply providing new “smart” charging cables. Most people probably find it annoying to have to crack open the unit to replace the battery, but for an EE like yours truly, it's a fun opportunity to see what's under the hood.

Unsurprisingly, there is an ESP32 in there, as in probably 95%+ of other small internet-connected devices. You can also see some good engineering choices, with lots of test points broken out for debugging and some cute graphics on the silkscreen to boot.