Over the past year, AI tools like ChatGPT have rocketed to the top of our list of societal conversation topics related to technology. There are as many opinions about AI and its influence as there are people to have them. Whatever you think about AI tools, there’s no escaping that they will have a significant influence on our future. And not only on our future as individuals, but on our life together in community.
While at first pass it might be tempting to see AI tools as an entirely new enterprise, I’m hesitant to embrace this perspective. There are certainly aspects of Large Language Models (LLMs) and their applications that are truly new. We’ve never had chatbot interfaces that could respond to us with such convincing prose. But other aspects depend on existing technical and political resources that aren’t new per se.
For example, the fact that LLMs like GPT-4 can compose text so convincingly depends on the dizzying (and ever-expanding) amount of information accessible through the internet needed to feed them. By the same token,¹ these LLMs also require an almost unimaginable amount of computing resources to do something with this information. Even a decade ago this level of computational horsepower would have been hard to imagine.
After the dust settles on the technological “wow factor” and you look more closely, you’ll see that LLMs and the AI revolution raise some of the same questions that technologies have raised throughout history, from the invention of fire all the way down through the printing press and the semiconductor. Questions like:
After we create a technology, how does it shape us?
What are the moral and ethical implications embedded within a given technology?
How does a specific technology influence us collectively after reaching widespread adoption?
How does a specific technology influence us as individual users?
What virtues and vices does a given technology support?
These are timeless questions that live outside the purview of technology development and are squarely in the domain of philosophy. An LLM can’t tell you anything about whether you ought to develop it. It’s easy to figure out what technology will do for you, but even more important is to ask what it will do to you.
Today I want to unpack a few of these questions and spend some time explaining how I think they can help us to develop our own philosophy of technology. My hope is that this approach might stimulate our curiosity and serve both as a way for us to understand what a particular technology is and guide us as we interact with it as users and designers.
But first, what is technology?
It can be tricky to actually get your arms around what technology is. On the one hand, in normal conversation in the 21st century, when we talk about technology we’re often thinking specifically about digital devices. If you read a technology policy for a classroom or event, it’s usually concerned specifically with how we will engage with phones and other devices. But this is only a small part of what technology encompasses.
One definition of technology that I’m workshopping these days is any artifact created to shape the world. There are a few important parts of this definition:
Technology is an artifact. It’s something that is external to us. As an artifact, technology includes, but is not limited to, physical objects. While this definition includes physical objects like hammers, pens, and books, it also includes more abstract artifacts. For example, an algorithm to sort numbers is a technology, as are grading systems.
Technology is created. Technology is the new application of existing ideas or materials. It is something that we build. In creating it, we are engaging in a problem-solving process: seeing something in the world and then designing an artifact to address it.
Technology shapes the world. Technology is not a neutral invention. As an artifact created to address a specific problem or challenge, it is designed to have an influence. It doesn’t just exist on its own but is connected with the world by shaping it, in both physical and more abstract ways. Concrete is a technology that shapes the world by enabling the construction of roads and buildings. Film is a technology that allows us to capture the pattern of light intensity and store it.
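To make the “abstract artifact” point above concrete: a sorting algorithm is a technology in this sense even though it has no physical form. Here’s a minimal sketch in Python (the function name and example data are mine, purely for illustration):

```python
# A sorting algorithm fits the definition of technology: it's an artifact
# (external to us), it was created (designed to solve the problem of
# ordering), and it shapes the world (it reorders data) -- all without
# being a physical object.
def insertion_sort(items):
    """Return a new list with items in ascending order."""
    result = list(items)
    for i in range(1, len(result)):
        current = result[i]
        j = i - 1
        # Shift larger elements right until current's spot is found.
        while j >= 0 and result[j] > current:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = current
    return result

print(insertion_sort([5, 2, 9, 1]))  # → [1, 2, 5, 9]
```

You can’t hold this artifact in your hand, but it was designed with intent and it does work in the world, which is exactly the point of the definition.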
With these aspects of technology in mind, here are five questions that I use to consider my own philosophy of technology. I hope they are helpful to you as well as you think about your perspectives and practices with respect to technology.
1. After we shape a technology, how does it shape us?
We often think that the design and use of technology is a one-way street. We’re steering the car or guiding the horse, not the other way around. But in reality, the technology shapes us as well.
This idea reminds me of an image from Jonathan Haidt’s book The Righteous Mind which he uses to describe how we make decisions. We often think we’re in control as rational actors, but the truth is often more complicated. The metaphor is that our decision-making process is like a rider on an elephant’s back. As the rider, we think that we are making rational decisions, steering our faithful pachyderm where we want it to go. But the truth is, more often than not, we’re in service to the elephant and where it wants to go. In this frame, our role is more post hoc justification than thoughtful and intentional trip planning.

So it goes with technology. We often create technology with a certain goal in mind, but along the way, we find ourselves fighting for control of the steering wheel. We want to use our email as a convenient tool to communicate with our colleagues, but it ends up turning us into email-checking addicts. We want to use our phones to take photos of our kids and store our memories, but instead we end up missing the moments as we ironically try to capture them.
All technology is like this in some way. We shape the tools. Thereafter they shape us. We need to be aware of this tendency both as we use and design tools.
2. What are the moral and ethical implications embedded within a given technology?
Is technology neutral? It’s a fun topic for dinnertime conversation, and I’ve increasingly come to the conclusion that the answer is no. While most technology is controlled by a human who bears the responsibility for the ethical implications of its use, there is a burden borne by the designers in this process as well. The particular design of a tool can make it more or less suited to supporting positive or negative ends.
While we may design a technology with only the best intentions in mind (e.g., a hammer for driving nails to build a house, not for use as a weapon), as designers and users we bear a responsibility to think about how the features of our tools might be used in ways we don’t intend, and whether particular features of the design incline it toward those unintended use cases. As we design tools, we need to consider the various ways our end users will use the technology and, at the very least, the potential for our devices to be used in harmful ways.
3. How does a specific technology influence us collectively after reaching widespread adoption?
One of my favorite examples from Ursula Franklin’s lectures on The Real World of Technology is her demonstration that the impact of a specific technology on a group of people can be very different from its impact on an individual. She writes about how the sewing machine, originally intended as a tool of liberation, insidiously ended up supporting exploitative practices.
Manufacturers and promoters always stress the liberating attributes of a new technology, regardless of the specific technology in question. There are attempts to allay fear, to be user-friendly, and to let the users derive pride from their new skills.
…
The authors of this prognostication evidently assumed that the introduction of the sewing machine would result in more sewing — and easier sewing — by those who had always sewn. They would do the work they had always done in an unchanged setting.
…
Sewing machines became, in fact, synonymous not with liberation but with exploitation.
She further notes that this is often not a function of the technology itself, but of the surrounding systems and infrastructures which facilitate their use.
What turns the promised liberation into enslavement are not the products of technology per se — the car, the computer, or the sewing machine — but the structures and infrastructures that are put in place to facilitate the use of these products and to develop dependency on them.
This is an aspect we often don’t think about as we’re designing technology: what happens when this technology becomes part of a larger system of use? How might the advantages a technology offers an individual flip on their head when it is used within a larger, more coordinated structure?
4. How does a specific technology influence us as individual users?
Of course, in addition to the influence of technology when used within a larger system, there is an impact on individual users as well, both in isolation and as members of a society. Perhaps nothing illustrates this more poignantly today than the smartphone.
The idea of the smartphone seems almost universally positive: it facilitates communication with our family and friends even across great distances, allows us to capture photos and videos of the things we see and wish to remember, and puts in our pockets, accessible 24 hours a day, 7 days a week, an amount of computational power that would have been almost unimaginable even ten years ago.
But as we know, the story isn’t nearly so positive in practice. What seems like it would have a strongly positive influence has in many cases turned out to be exactly the opposite. We’re back to Franklin: tools with aspirations of liberation have become imprisoning. We’re addicted to our devices in a way that keeps us apart from others. The temptation of always-on connection ironically means that we’re often disconnected from the people right in front of us. And this not only separates us from those around us but also pulls us away from our own being, as push notifications buzz and beep their way into our lives.
While it’s important to consider how a given technology influences us collectively, we also need to consider how the particular patterns of being that it supports and encourages form us.
5. What virtues and vices does a given technology support?
As we think about the impact of our technology, it’s worth thinking about it within the framework of virtues and vices. This can be a helpful way to decide if and how we want to engage with technology, for our individual flourishing and the flourishing of our neighbors.
Virtues describe those qualities that lead to a life of human flourishing, ideas like love, curiosity, perseverance, patience, kindness, honesty, and integrity. How does our technology use foster these? And to think in the opposite direction, how do technologies create fertile ground for our vices to grow like weeds? How do the technologies that we use foster greed, envy, malice, hatred, and violence?
Approaching the world with this lens in mind can be illuminating. For me, my car often fosters my impatience. Traffic frustrates my development of the virtue of patience and fosters anger. The speed and power of the car mean that I’m focused on getting from point A to point B in the shortest time possible. It doesn’t encourage me to develop curiosity and pay attention to what is flying by the window.
We all need a philosophy of technology
In our tech-centric world, it’s becoming increasingly important for all of us to develop a philosophy of technology. The truth is that we develop practices and ways of using technology whether we consciously think through them or not. But if we don’t stop to ask the questions, someone will answer them for us, and their answers may not be aligned with our flourishing as individuals or as a community.
I hope that this list of questions can be a helpful place for you to start developing your own philosophy of technology. As you reflect on the questions I’ve listed above, what stands out? What dimensions are missing from my list? Which question is the most incisive for you and your practices? I’d love to continue the conversation over email or in the comments below.
The Book Nook
We just finished The Sentence is Death by Anthony Horowitz last week for our murder mystery book club. This was the second book in the Hawthorne and Horowitz series, following up on the first one which we read a few months ago, The Word is Murder.
These are fun murder mysteries to read, well written, with plenty of clues to help you put the pieces together. It’s also fun because of Horowitz’s device of writing himself into the novels as the narrator. The story often refers to other screenplays, TV shows, or books he’s actually written, which keeps you guessing about what is fiction and what is real life.
The Professor Is In
A core piece of the Harvey Mudd mission statement is to help foster STEM leaders who are able to wrestle with the impact of their work on society. While I’m more partial to the language of influence as compared to impact, I deeply resonate with the value of wrestling with the impact of our work on society. This includes the ways that the work that we do may have unintended or ill-considered consequences and the responsibility we bear as individuals and as organizations for both the positive and negative influence of our work.
Over my first four years at Mudd, I’ve felt a hunger from students for more of these conversations. I’ve also felt the need to continue to wrestle with these issues myself.

Toward that end, this semester I decided to prototype a new weekly reading group with a few Mudders. Every Monday we squeeze in a quick lunch between classes and discuss a short reading together. We started with Ursula Franklin’s 1989 CBC lectures, The Real World of Technology, and are working in a few readings from each of the speakers coming to campus this fall for the Nelson Distinguished Speaker Series. Yesterday we read and discussed Jaron Lanier’s essay in the New Yorker, “There Is No A.I.”
I’m so grateful to have the chance to learn alongside these thoughtful and curious students and to dig deeper into the philosophical edge of our work as engineers!
Leisure Line
Boy, do I have a weak spot for donuts. We’re fortunate to have a great donut place near us in Pasadena, Randy’s Donuts. Here are our selections from a trip last Saturday. It was quite a feat to hold the kids off long enough to get this picture!
Of course, Los Angeles has a long history of great donut places. If you’re curious about some of the history of donuts in LA, and particularly about the many Cambodian-owned shops around, I highly recommend the documentary The Donut King, available to stream on Hulu. It’s a thoughtful and moving film, detailing the story of Ted Ngoy (aka the Donut King) and the way he changed the landscape of donuts in SoCal forever. I’ll stop there to avoid giving more away, but the story also has a deeply human edge that is thought-provoking.
Still Life
As the nights get colder, I’m enjoying the chance to get the Solo Stove out. I love the solitude of sitting by the fire to think and listen to the quietness of the night.
¹ Pun intended, in case you’re wondering. https://learn.microsoft.com/en-us/semantic-kernel/prompt-engineering/tokens