Thank you for being here. As always, these essays are free and publicly available without a paywall. If you can, please consider supporting my writing by becoming a patron via a paid subscription.
Happy one-week-until-tax-day for those of you who celebrate! Tax season is a reminder of the perpetual human temptation to get as close to ethical lines as possible without crossing them. It's human nature to search for ways to check the box with the least effort possible.
The search for the line between ethical and unethical behavior is of course not limited to our taxes. These days we're also hearing a lot of discussion about the ethical uses of technology, and in particular, AI.
These discussions of ethical principles for the development and use of technology are important. We should be building consensus around the ways to ensure our technology benefits society, respects human rights, and minimizes potential harm to ourselves and our world.
But what if developing and using technology ethically wasn't enough? Principles like fairness, respect for privacy, safety, accessibility, and sustainability matter for our flourishing and for the good of the world around us, and our technology should meet or exceed them. But just like tax law, our natural impulse is to frame ethical guidelines in such a way that we optimize for satisfying the requirements without exceeding them.
What if the question of whether a particular technology is ethical or unethical is the wrong one? Ethical principles don't come from nowhere. They are developed from the careful inquiry of deeper philosophical questions like:
What does it mean to be human?
What makes a good life?
What are my responsibilities to those around me?
What responsibilities do I have to myself?
What is the nature of reality?
The problem with much of our conversation around the ethical development of technology is that it doesn't go deep enough. All too often we stop at the layer of ethics. We ask what is the minimum required instead of chasing a deeper vision of human flourishing. What if we saw the ethical use of technology not as something that outlines the bounds of what we can do, but as a guiding force that points us to a deeper and bolder vision of what we should do?
We've got to stop viewing ethical considerations as a list of constraints to be satisfied with as little margin as possible. To do that we've got to change the whole ethos of our approach to our work. The ethical way is the baseline. To go deeper, we need a redemptive reframe.
The Redemptive Frame
The idea of redemptive applications of technology is fresh in my mind after a great forum hosted by Praxis that I participated in last week in New York. Over the course of two days, a small group of us gathered to brainstorm a redemptive thesis around AI.
If you come into the Praxis orbit for any significant period of time, it won't be long before you hear about The Redemptive Frame. This concept is the guiding principle for the work Praxis does to cultivate new entrepreneurs and their ventures. But even if you wouldn't describe yourself as an entrepreneur, all of us exercise entrepreneurial functions in our work, especially in the knowledge sector.
The Redemptive Frame is a way of understanding our work and its impact. While it applies more broadly to our careers, it casts an important vision for us to consider as we think about technology, and AI in particular.
The Redemptive Frame is built around three axes that describe different aspects of our work:
What we build (Strategy)
How we build (Operations)
Why we build (Leadership)
While this framework provides a helpful guide for Praxis to use with the entrepreneurs they work with, it is valuable for all of us as we pursue meaningful work.
These three axes describe the aspects of our work but don't make any normative judgments about the answers to these questions. This is where Praxis's three shells provide an evaluative structure, delineating between the exploitative, the ethical, and the redemptive ways.
To illustrate, let's see how these categories might guide us in analyzing AI technologies and applications.
Exploitation is easy to find
It's pretty easy to find examples of exploitative AI. Exploitation tends to be our default mode as humans absent the restraint of our consciences. Here are a few examples along each axis.
What: Applications like deepfake pornography or products that allow malicious actors to deceive others.
How: Using data without the proper permissions or ignoring the biased nature of the underlying data.
Why: Developing AI with the sole goal of maximizing profits at all costs without considering the potential societal or environmental impacts.
If only these examples were hypothetical. Unfortunately, we needn't look long or hard to find real-life examples where exploitation is happening. Our response to exploitative applications should be to expose them and work to eliminate them wherever we can.
Ethical is a minimum standard
The next shell describes an ethical approach. Conversations here are around ways to develop and use AI that meet the standards of fairness, transparency, and respecting the privacy of users. Here are some examples in the ethical shell.
What: AI systems that improve our ability to detect and filter spam online or to otherwise reduce content that is used to mislead and deceive.
How: Training AI systems with data that has been properly licensed or collected with appropriate considerations of the rights of users. It could also include building AI systems with a focus on transparency and explainability and actively working to ensure the outputs from AI models are fair and just.
Why: Developing AI models to help improve human flourishing and address societal challenges like using AI in medicine to assist doctors in giving the best care possible to all of their patients.
In our world today, the ethical shell is often seen as the highest aim. Ethical approaches seek a win-win situation and strive to find ways to develop technology such that profits for the developer are matched by benefits provided to the end user. We certainly should hold ourselves and our corporate entities to ethical standards.
But is the pursuit of ethical considerations enough? What would it look like to consider an even more audacious goal for our work?
Redemptive should be our goal
This brings us to the final shell, the redemptive way. Working in the redemptive way goes beyond thinking ethically to pursue repair and restoration. Redemptive applications recognize the brokenness in our world and seek to address and redress it. Where ethical applications are content with not making things worse, redemptive applications are aimed at making things better. They are aimed at closing gaps, giving sacrificially, and renewing our world. The redemptive way is motivated by a focus on others above ourselves. Here are a few examples.
What: Using AI to create technologies that foster positive community engagement. One could envision an AI tool that makes it easier for groups of friends to overcome the logistical hurdles that prevent us from gathering in person, like coordinating schedules to find available times without lots of back-and-forth messaging, and suggesting activities that would help us cultivate community. This application demonstrates a redemptive vision by using technology to support the good of in-person community instead of replacing it.
How: Intentionally involving the end users of AI technology in the design process to hear directly from stakeholders and to ensure that the tools address their needs. Redemptive applications go beyond ensuring people are not harmed; they seek to bless them.
Why: Designing products and business models that make sacrificial tradeoffs from the outset. For example, one could imagine sacrificing profit to more effectively serve and give back to users. This could also look like designing technology specifically for users who have been historically neglected or ignored.
Our world needs the redemptive way
We don't need to look far to find the brokenness in our world. The unfortunate truth is that it's often much harder to find signs of hope.
The brokenness we see is a call to redemption. We must not accept anything less than ethical applications, ways of building, and reasons for building, but settling for ethical norms doesn't go far enough. Stopping at the ethical ignores the opportunity we have to invest our resources in redemptive ways.
The redemptive way is the way of sacrifice. It will cost us something. We'll need to leave potential gains on the table, expose ourselves to additional risk, slow down to do things the right way and avoid shortcuts, and emotionally invest ourselves in our work in ways that will stretch us and sometimes even break us.
As a follower of Jesus, I see the redemptive way as an invitation to walk the way of the cross. It's a call to sacrifice my ambitions to serve others, to set aside my priorities and put others' concerns before my own. It is a costly path, but one full of meaning.
I'm well aware that many of you reading this likely do not share my Christian convictions and that is ok. As I've been reminded time and time again, there is plenty of space to build partnerships and coalitions across our disagreements by embracing our shared principles and vision of the good life. I hope that you might consider how the redemptive frame might reframe the way you think about the vision of a flourishing life.
Our world does not lack opportunities for redemptive action. What is always in short supply are people willing to embark on good, bold, redemptive quests to carry that action out into the world. Join me in the journey of pressing beyond the ethical and investing, boldly and sacrificially, for our good and for the good of our neighbors.
Recommended Reading
If you want to learn more about the redemptive way, I would commend the following resources from Praxis:
In addition, here are a few pieces that crossed my path this week worth sharing.
This was a fantastic read from
on his journey to faith.

I enjoyed this piece from
writing about some themes similar to the questions I've raised today.

This piece from
on the complicated role of faculty positions.

The Book Nook
After getting through the first part of the book where McLuhan lays the foundation, I’m enjoying dipping into some of the specific analyses in the second part.
Here are a few quotes that I’ve highlighted as I’ve been reading.
The Greek dramatists presented the idea of creativity as creating, also, its own kind of blindness, as in the case of Oedipus Rex, who solved the riddle of the Sphinx. It was as if the Greeks felt that the penalty for one break-through was a general sealing-off of awareness to the total field.
Any invention or technology is an extension or self-amputation of our physical bodies, and such extension also demands new ratios or new equilibriums among the other organs and extensions of the body.
And a technological extension of our bodies designed to alleviate physical stress can bring on psychic stress that may be much worse.
No society has ever known enough about its actions to have developed immunity to its new extensions or technologies. Today we have begun to sense that art may be able to provide such immunity.
It is a persistent theme of this book that all technologies are extensions of our physical and nervous systems to increase power and speed.
The Professor Is In
It’s time for the first deployment in E80 this week, where students take their robots to the lake just north of campus at the Bernard Field Station. Exciting that we are getting closer day by day to the launch at Dana Point on the 20th of April!
Leisure Line
We’ve got some new fish in the Brake household as of yesterday. A new pink tetra and a pleco to join our two small neon tetras.
Still Life
The rain clouds spoiled the middle days of my trip to NY, but I enjoyed being in the city regardless. No place quite like it.