Psychological Safety is the Lifeblood of a Healthy Team
Reflections on how the psychologically unsafe culture at Theranos led to its demise and how we might chart a different path

While it’s often possible to see the importance of something by its presence, sometimes its absence can be even more instructive. This last week I finished Bad Blood, the gripping story of the rise and fall of Theranos by journalist John Carreyrou. After finishing the book I also watched the excellent HBO documentary The Inventor, which adds commentary along with recordings and videos of many of the main events in the story. Carreyrou’s account of the “too good to be true” blood-testing startup, which went belly up in 2018 after rising to meteoric heights only a few years earlier, reads like a thriller.
In addition to being nearly impossible to put down, the story is a great lesson in the vital importance of a healthy team culture. In particular, the lack of psychological safety within the company created the perfect conditions for the numerous cover-ups that ultimately led to Theranos’s massive implosion.
Theranos: Revolutionizing healthcare at any cost
If you’re not familiar with the story, here’s the short summary: Theranos was founded in 2003 by a then-19-year-old Stanford dropout named Elizabeth Holmes. The goal of the company was to revolutionize healthcare by developing a new diagnostic system that would allow blood tests to be performed with a few drops of blood from a finger prick instead of the standard venous draw. In the ensuing years, Holmes built a massive company, growing Theranos to a peak valuation of roughly $9 billion, in large part by leveraging an extensive network of rich and powerful connections, including former high-ranking government and military officials.
Unfortunately, there was a catch. And a big one at that, especially for a medical device company. The device they developed did not, in the slightest, work as described. While you would have no idea from Holmes’s impassioned and steadfast public statements to the contrary, Theranos was only able to get a small subset of the tests they advertised to work on their new devices. Despite this, Holmes and Sunny Balwani, her second-in-command and romantic partner, continued to double down on the technology, never giving an inch even to straightforward and direct public questions about their devices. In many situations, their responses amounted to bald-faced lies about what was actually going on behind the scenes.
I won’t rehash the whole book, but suffice it to say that Theranos was riddled with deception and lies, propped up by a seductive and compelling vision and an enrapturing founder and CEO in Holmes.
While the story is engaging on its own terms, there’s a lot to be learned from an analysis of the root causes of Theranos’s collapse. The story is a cautionary tale about the danger of taking the “fake it until you make it” mindset too far. Perhaps more than any other single cause, the lack of a psychologically safe environment was the disease at the root of Theranos’s demise.
What is psychological safety?
I first learned about the concept of psychological safety soon after starting my faculty position at Harvey Mudd College. In a nutshell, a psychologically safe team is one whose members can take risks, raise concerns, or admit mistakes without fear of negative consequences. Succinctly, in the words of Amy Edmondson, author of The Fearless Organization and an expert on the topic, psychological safety is “felt permission for candor.”
As I’ve thought more about this and how it has been both present and absent in teams I’ve been on or advised, it’s become clear to me how important this quality is for a team to reach its full potential. In Bad Blood, there are many examples of how the lack of psychological safety at Theranos was pervasive and had real consequences. My hope is that by analyzing the problems that stem from its absence, you might reflect on how to make the teams you are on more psychologically safe and avoid a similar fate.
Psychological safety supports divergent thinking
One of the most important outcomes of psychological safety on a team is robust support for divergent thinking. Divergent thinking and diverse perspectives are critical for creativity and important ingredients of a high-functioning team. This idea surfaced for me this past week as I was listening to a conversation on The Tim Ferriss Show with Michael Mauboussin, an investment strategist, author, and professor at Columbia Business School.
One part of the conversation that really resonated with me was when Michael shared the reasons behind the phenomenon of the wisdom of crowds. The basic idea is that in particular settings, the collective wisdom of a group of people is better than any one member’s. For a quick example, check out this 4-minute video where my Harvey Mudd colleague Talithia Williams demonstrates this concept by visiting the LA Fair and performing an experiment with passersby and a jar of jellybeans.
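To see why the averaging works, here’s a minimal Python sketch of a jellybean-style experiment (the jar size, noise level, and crowd size are all made up for illustration). It simulates a crowd of guessers whose individual estimates are noisy but independent and then compares the crowd’s average guess to the typical individual error.

```python
import random

# A toy model of the jellybean experiment. The key assumption behind the
# wisdom-of-crowds effect is that individual errors are independent, so
# they tend to cancel out when averaged.
TRUE_COUNT = 1000   # actual number of jellybeans in the jar (made up)
NOISE_STD = 300     # how noisy each individual guess is (made up)
CROWD_SIZE = 200    # number of independent guessers (made up)

random.seed(42)  # fixed seed so the sketch is reproducible
guesses = [random.gauss(TRUE_COUNT, NOISE_STD) for _ in range(CROWD_SIZE)]

crowd_estimate = sum(guesses) / len(guesses)
typical_error = sum(abs(g - TRUE_COUNT) for g in guesses) / len(guesses)

print(f"Crowd's mean guess:       {crowd_estimate:.0f}")
print(f"Crowd's error:            {abs(crowd_estimate - TRUE_COUNT):.0f}")
print(f"Typical individual error: {typical_error:.0f}")
```

The cancellation only works when the errors are independent. If everyone’s guesses get pulled in the same direction, say by a culture where dissenting estimates are unwelcome, averaging a large crowd buys you very little.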
The wisdom of crowds is an important consideration when composing a team of problem solvers. By building a team where the individual members come from a wide variety of backgrounds and attack problems from different angles, you can tap into this powerful collective wisdom.
But there is a catch. In order for this advantage to materialize, you need to support and encourage different perspectives. This is where psychological safety plays a role, enabling and encouraging individuals to share their contrary opinions.
Theranos built a passionate and driven team, but the lack of a psychologically safe environment cut the potential strength of that bright, interdisciplinary team off at the knees. The paranoia of Holmes and Balwani flowed down through the ranks of the company and impacted every employee. Teams across the company were not allowed to talk to one another, siloing the engineering team from the chemists developing and running tests at the bench. Holmes and Balwani squashed any negative feedback and rebuked employees for raising concerns, leading to a culture of fear and a revolving door of resignations. Theranos lacked psychological safety from top to bottom.
The wisdom of crowds is a powerful advantage supported by a diverse collection of perspectives. But if the team culture doesn’t support and encourage this diversity by fostering psychological safety, the advantage is lost and a toxic culture of shame takes its place.
The Theranos story is a word of caution against unbounded optimism and a reminder of the right framing for the prototyping mindset
As an engineer I naturally see the world through the lens of block diagrams, transfer functions, and control systems. One critical part of designing effective engineering systems is incorporating feedback signals that provide information about the difference between the actual and desired outputs, then adjusting the inputs to close the gap. Open-loop control, which operates without these feedback signals, is fundamentally flying blind: we have no idea whether the actions we are taking are moving us closer to our desired state.
This picture of a closed-loop feedback system is fundamental to the prototyping mindset. As a reminder, the prototyping mindset is a way of developing solutions to the challenging and often ill-posed problems in life by approaching them with an iterative process of experimentation, reflection, and adjustment. Since the prototyping mindset relies on a process of iteration and leverages feedback to make adjustments, it’s critical that it operates as a closed-loop system. Before we connect this back to the situation at Theranos, let’s take a quick detour to sketch out open- and closed-loop control systems to make this point clear.
Open- and closed-loop systems are a helpful, albeit limited, analogy for understanding team dynamics
The diagram above illustrates an open-loop control system. An input indicating the desired output is applied to the controller, generating an action which leads to an output via the process block. However, there is no way for us to know whether the input we are providing is moving us closer to or farther from our desired output.
In contrast, a closed-loop system builds on this architecture by introducing a feedback loop and two new components: a summing junction and a measurement device. The input and output of the system are still the desired and actual outputs, respectively, but now we measure the actual output signal and compare it to the desired output (the input to the whole system on the left-hand side). Then we feed the difference between our actual and desired output into the controller, which uses this error signal to change course appropriately.
You interact with closed-loop control systems like these every day. Cruise control in your car is one of the most common examples, computing the difference between your desired and actual speed and adjusting the signal to the throttle accordingly. Another is the thermostat in your house or oven, which measures the current temperature and then adjusts the heating or cooling element to bring the temperature to the level you set.
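To make the contrast concrete, here’s a minimal Python sketch of a thermostat-style loop (a toy room model with made-up numbers, not anything from the book). The open-loop controller applies a fixed heater power chosen in advance, while the closed-loop controller measures the room and drives the heater in proportion to the error between desired and actual temperature.

```python
# Toy comparison of open- vs. closed-loop control of a room's temperature.
# The room model and controller gains are invented for illustration.

def simulate(controller, setpoint=20.0, steps=60):
    """Step a crude room model: the room leaks heat toward the 10 C
    outdoors and warms in proportion to the heater power we apply."""
    temp = 10.0  # the room starts at the outdoor temperature, in C
    for _ in range(steps):
        power = controller(setpoint, temp)           # heater power in [0, 1]
        temp += -0.1 * (temp - 10.0) + 1.0 * power   # heat loss + heating
    return temp

def open_loop(setpoint, measured_temp):
    """Open loop: ignore the measurement and apply a power level chosen
    ahead of time. If our model of the room is wrong, nothing corrects it."""
    return 0.5

def closed_loop(setpoint, measured_temp):
    """Closed loop: compare the measured output to the desired output and
    drive the heater in proportion to the error (proportional control)."""
    error = setpoint - measured_temp
    return max(0.0, min(1.0, 0.8 * error))

print(f"Open loop settles at:   {simulate(open_loop):.1f} C")    # about 15 C
print(f"Closed loop settles at: {simulate(closed_loop):.1f} C")  # about 19 C
```

In this toy model the open-loop room settles well short of the 20 C setpoint, while the closed-loop version gets close and, more importantly, would adjust on its own if conditions changed. (A purely proportional controller still leaves a small steady-state error; adding integral action is the standard way to remove it.)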
However, the control system analogy extends beyond these engineering examples. It also applies to the problems we try to solve across the various domains of life. We want to maintain a healthy weight, so we modify our diet and exercise to close the gap. We want to build stronger relationships, so we reflect on the current state of our friendships and make adjustments to spend more intentional time connecting with our loved ones.
While the control system analogy is helpful, it gets much more complicated when applied to social systems. In the thermostat example, the temperature sensor doesn’t yell at the controller to make an adjustment the way team members might grapple with each other after a challenging conversation. While the underlying framework of monitoring and incorporating feedback is still helpful, in the context of social systems things are much fuzzier. People aren’t machines with a tidy mathematical description. Outputs and inputs aren’t simple physical signals that cleanly describe a metric of interest with a known level of confidence.
Theranos collapsed because it ran as an open-loop system. Holmes and her leadership team knew the desired output state but willfully blinded themselves to the many red flags in their technology. By fostering a culture of paranoia instead of one of psychological safety, they crippled their systems of control and feedback, short-circuiting the signals they could have used to learn and adjust their course of action.
A psychologically safe environment is a critical component of a team culture because it helps to keep the closed-loop control system stable. Most directly, it maximizes the system’s ability to correct errors between the current and desired state. While it’s highly unlikely that the culture of your team is as toxic as the one described in Bad Blood, it’s worth pondering the responsibility we each have to foster psychologically safe environments, for the good of our organizations, our colleagues, and ourselves.
The Book Nook
My book for this week is Bad Blood by John Carreyrou. The cover blurb from the New York Times Book Review on my copy says that it “reads like a thriller,” which is a truly fitting description. This is a must-read for anyone interested in engineering ethics and tech.
Selected memorable quotes
A snippet from Erika Cheung in her email to Elizabeth Holmes sharing her deep concerns with the Theranos technology and Elizabeth’s management style.
I am [sic] only hope that somehow I bring awareness to you that you have created a work environment where people hide things from you out of fear. You cannot run a company through fear and intimidation…it will only work for a period of time before it collapses.
The author reflecting on the crux of the issue: a deeply flawed and broken company culture.
The biggest problem of all was the dysfunctional corporate culture in which it was being developed. Elizabeth and Sunny regarded anyone who raised a concern or an objection as a cynic and a naysayer. Employees who persisted in doing so were usually marginalized or fired, while sycophants were promoted.
The Professor Is In
This Friday before everyone left for spring break, we wrapped up the first half of E80 and launched the final project. For their final project, the students need to design a robot that they will launch into the bay at Dana Point, CA, to perform an autonomous data collection mission. As long as the robot includes three unique sensors and can autonomously collect data for at least one minute, the students are pretty much free to design and build their robots however they would like.
The photo above is from the brainstorming session where students worked in their teams to map out what they want their robots to measure. It was a great way to end the first half of the semester, with lots of energy and excitement in the room.
Leisure Line
This week I added a few new fountain pens to my arsenal in addition to my trusty Pilot Metropolitan. I picked up a LAMY Safari (love the Caltech orange—officially Terra Red—color) and a clear TWSBI Eco. So far I'm enjoying them both, but am really loving the TWSBI Eco.
Still Life


Stumbled on a pretty big tree which toppled over in a neighborhood a few miles south of us after the heavy windstorms a few weeks ago. Thankfully it looked like the damage to the house wasn’t too extensive, but wow, that was a big tree.