Table of Contents
- The Simone Biles Moment: A Safety Decision in Public
- Now Put That Same Scenario in a Hospital
- Why the Double Standard Exists
- The Licensing and Credentialing Problem: “Get Help… But Also Prove You’re Not Risky”
- The Patient-Safety Argument Everyone Claims to Love (But Sometimes Ignores)
- So What Would It Look Like If Medicine Treated “Twisties” as a Safety Signal?
- What Patients Might Be Thinking (And Why They’re Not Wrong)
- The Real Takeaway: Biles Didn’t Lower the Standard; She Protected It
- Experiences From the Real World: What “Stepping Back” Looks Like in Medicine
Imagine this headline: “Star physician steps away mid-shift for mental health and safety.” Now imagine the comment section.
In sports, we’ve (mostly) learned to clap when an athlete hits pause to prevent catastrophe. In medicine, we often treat the same
decision like a moral failure. Same human nervous system, same risk calculation, totally different cultural verdict.
That’s why the thought experiment “If Simone Biles were a doctor she would be vilified, not praised” lands like a well-timed
jab: it exposes a double standard we rarely admit out loud. Simone Biles stepped back in Tokyo when her mind and body weren’t
syncing, and the world debated it, but the prevailing message became: safety matters, and athletes are people.
If a doctor did the professional equivalent (recognizing a dangerous disconnect and removing themselves from a high-stakes situation),
they’d likely be labeled “unreliable,” “weak,” or worse: “abandoning patients.”
The irony is painful: medicine and gymnastics share a core ethic: precision under pressure. The difference is that gymnastics has
a built-in foam pit. Healthcare often has a pager that screams, “LOL good luck.”
The Simone Biles Moment: A Safety Decision in Public
During the Tokyo Olympics, Biles withdrew from several events after experiencing the “twisties,” a phenomenon gymnasts describe
as a frightening loss of air awareness, a mind-body disconnect that can turn a routine into a dangerous gamble. When you’re doing
elite-level twisting skills, “a little off” isn’t like missing a free throw. It’s like your GPS rerouting mid-air.
Biles’ choice wasn’t about a bad day or a bruised ego; it was about preventing injury to herself and protecting her team.
She didn’t pretend she could “power through” a problem that was explicitly about losing control while airborne. She made a call
that many safety experts would describe as responsible: stop before someone gets hurt.
And while she was criticized by some corners of the internet (because the internet is a hobby farm for hot takes), a large
share of the public conversation shifted toward respecting boundaries, acknowledging mental health, and valuing safety.
Biles later returned to compete in a modified way (choosing routines aligned with what felt safe), another detail that matters:
she didn’t “quit,” she recalibrated.
Now Put That Same Scenario in a Hospital
Replace the balance beam with a busy emergency department. Replace twisting disorientation with cognitive fog from severe sleep
deprivation, a panic spiral, or a medication side effect. Replace “vault” with “intubation,” “stitching,” or “a septic patient
crashing at 2 a.m.”
If a clinician recognizes, “I’m not safe to practice right now,” the ethical move is to step back and hand off care. But the
social reality can be brutal:
- Colleagues may see it as dumping work.
- Administrators may treat it as a staffing problem you personally “caused.”
- Patients may feel abandoned, even if the alternative is a clinician operating at half capacity.
- Regulators and credentialing processes can make help-seeking feel risky.
In sports, stepping out can be framed as protecting the body. In medicine, stepping out is too often framed as failing the role.
That isn’t just unfair; it’s a patient-safety issue dressed up as “professionalism.”
Why the Double Standard Exists
1) Sports has a scoreboard; medicine has ambiguity (and blame)
In gymnastics, the risk is visible and immediate: if you’re lost in the air, you can get hurt right now. In medicine, risk is
often probabilistic. A tired, distressed, or distracted doctor might still perform “fine” 19 times out of 20, until the 20th time
becomes a headline. That ambiguity creates a nasty dynamic: we normalize pushing through because most days nothing explodes…
until it does.
2) The hero narrative is baked into medical training
Medicine has a long-standing culture of endurance: overnight calls, high-stakes decisions on little sleep, and a professional
identity that can quietly morph into “I must be invincible.” The result is a dangerous equation:
being human = being unfit. That’s how you get clinicians who feel guilty for getting sick, grieving, or needing a therapist.
3) The system is built on thin staffing and thick denial
Many healthcare environments run close to the edge, because hiring enough people costs money, and burnout is cheaper (until it
isn’t). When coverage is fragile, stepping away doesn’t look like a routine safety protocol; it looks like a crisis.
So the culture learns the wrong lesson: “Don’t be the crisis.”
4) Medicine punishes uncertainty, yet demands perfection
Athletes can say, “I don’t have it today,” and the world understands (or is learning to). Doctors often feel they must say,
“I’m fine,” even when they’re not, because uncertainty invites scrutiny, and scrutiny invites consequences.
The Licensing and Credentialing Problem: “Get Help… But Also Prove You’re Not Risky”
Here’s where the thought experiment gets sharp: if Biles were a doctor, her decision might not just be criticized; it could be
documented, interrogated, and retroactively reframed as evidence she shouldn’t practice.
Many clinicians have long worried that seeking mental health treatment could trigger complicated disclosure requirements or
licensing questions that feel intrusive. National medical organizations and researchers have argued for applications that focus
on current impairment (what affects safe practice now) rather than a broad history of diagnosis or treatment.
The goal is simple: encourage care-seeking and protect patients by reducing stigma and fear.
Some reforms are underway: guidance and advocacy efforts emphasize that questions should be limited, supportive in language,
and designed to identify functional impairment, without punishing people for responsibly getting care.
A major point repeated by multiple organizations is that protecting the public doesn’t require prying into every past therapy
appointment; it requires ensuring clinicians are safe to practice today.
If you want an analogy: a gymnast isn’t banned from competition because they once wore a brace. They’re evaluated on whether
they can safely perform now. Medicine should work the same way.
The Patient-Safety Argument Everyone Claims to Love (But Sometimes Ignores)
Healthcare leaders talk about “patient safety culture” constantly, often while creating conditions that make safety harder:
chronic understaffing, punishing schedules, and a subtle expectation that clinicians will sacrifice themselves to keep the
machine running.
We actually have plenty of evidence that healthcare workers have faced worsening mental health strain and burnout in recent
years, and that fatigue and burnout can affect performance. Accreditation and safety organizations have published guidance on
fatigue management and duty hours precisely because fatigue is not a personality flaw; it’s a predictable risk factor.
When a system routinely places people in high-stakes roles while exhausted, stressed, or unsupported, that system is rolling
the dice with patient outcomes.
In other words: the “Simone Biles move” in medicine (stepping back when you recognize you’re not safe) should be seen as a safety
behavior, not a character defect.
So What Would It Look Like If Medicine Treated “Twisties” as a Safety Signal?
Create a real “pause button” for clinicians
In many hospitals, the “pause button” exists only in theory. In practice, stepping back can trigger guilt, retaliation, or an
impossible workload shift to colleagues. A real pause button means:
- Backup coverage protocols (like aviation redundancies, not “call your friend and beg”).
- Clear, non-punitive pathways to say: “I’m not safe right now.”
- Leadership messaging that frames stepping back as risk prevention.
Fix the incentives that reward silence
If the system punishes honesty, people will learn to lie. Not because they’re unethical, but because they’re trying to keep a job.
Normalize help-seeking by removing fear-based barriers in credentialing and licensing processes and by offering confidential support.
When people can get care without career panic, patients benefit.
Invest in “just culture,” not “gotcha culture”
A just culture recognizes that safety failures often come from system conditions, not just individual negligence.
When clinicians make mistakes, or nearly make them, support matters. So does learning. A punitive culture teaches people to hide
problems. A learning culture surfaces them before harm occurs.
Make scheduling and staffing a safety strategy, not a spreadsheet trick
Burnout isn’t solved with a free yoga app and a poster that says “Self-care!” (Although, sure, yoga is nice.)
It’s solved when staffing levels, workflow design, and workload expectations stop treating human beings like rechargeable
batteries you can store in a hot garage and expect to perform at peak capacity.
What Patients Might Be Thinking (And Why They’re Not Wrong)
If you’re a patient reading this, you might be thinking: “Okay, but if my doctor steps away, what happens to me?”
That concern is valid. Continuity matters. Trust matters. Waiting is scary. And the healthcare system has a habit of making
patients pay for its staffing failures.
The answer isn’t “patients should just understand.” The answer is: the system should be designed so that a safe handoff is
routine, not catastrophic. In healthy systems, when someone steps away, care continues smoothly. In fragile systems, stepping
away triggers chaos, and then everyone blames the person who noticed the danger.
We can do better than that. We must.
The Real Takeaway: Biles Didn’t Lower the Standard; She Protected It
Simone Biles didn’t ask the world to redefine excellence as “anything goes.” She modeled something harder:
the discipline to stop when continuing becomes unsafe. That is not softness. That is professionalism.
If she were a doctor, the safest, most ethical choice might still be to step back. But our medical culture often treats that
choice as betrayal rather than risk management.
The goal isn’t to turn medicine into a feelings seminar (though a little emotional literacy wouldn’t kill us).
The goal is to align incentives with safety: reward the clinician who raises a hand and says, “Not safe,” the way we reward the
pilot who refuses to fly a broken plane.
We’ve started learning this lesson in sports. It’s time to apply it where the stakes are even higher.
Experiences From the Real World: What “Stepping Back” Looks Like in Medicine
The stories below are composite scenarios drawn from commonly reported experiences in healthcare (details changed, patterns
kept), because the point isn’t one person’s identity. It’s the system’s reflex.
1) The resident with “twisties,” medical edition
A resident finishes an extended shift and realizes they’re reading the same lab result for the third time and still not processing it.
The feeling is less “sleepy” and more “my brain is buffering.” They consider telling the senior physician they’re not safe to
do a procedure. But they hear an old training mantra in their head: “You’ll be fine.” They imagine the eye-roll. The evaluation.
The subtle label: not tough enough.
So they push through, until a nurse catches a near-miss medication order. The resident feels relief mixed with nausea:
“That could’ve been real harm.” And the lesson their body learns isn’t “ask for help early.” It’s “don’t get caught.”
2) The attending who knows the right call, and hates it
An attending physician wakes up with crushing anxiety after weeks of relentless staffing gaps. They’ve been functioning, but the
fuse is short now: snapping at colleagues, struggling to focus, feeling the edges of panic during routine decisions.
They do the responsible thing: they call the department to ask for coverage.
The response isn’t cruelty, exactly. It’s worse: logistical despair. “We don’t have anyone.” “Can you come in for half a day?”
“What if you just do the easier cases?” The physician hears the unspoken translation: “Your needs are inconvenient.”
They come in anyway, because abandoning colleagues feels unbearable. The shift is technically “fine,” but their empathy is gone.
They go home hollow. The system got a body in a white coat. The patient got less of a human.
3) The therapist question that changes behavior
A clinician finally schedules counseling after months of insomnia and intrusive work stress. Then they remember a credentialing
form from a previous job that asked about mental health treatment in a way that felt broad and vaguely threatening.
They don’t know what they’d have to disclose. They don’t know who would see it. They don’t know whether “getting help”
gets interpreted as “being unsafe.”
So they cancel the appointment. Not because they don’t believe in therapy, but because they believe in rent.
This is how stigma becomes operational: not as a rude comment, but as a quiet deterrent that keeps people from care.
4) The “Biles protocol” that actually works
In a rare bright spot, a clinic creates a true backup system: a floating clinician for same-day coverage, protected time for
handoffs, and leadership that explicitly praises stepping back when safety is compromised.
One day, a clinician says, “I’m not okay today; I need to tap out.” The response is immediate and non-dramatic:
“Got it. We’ve got you. Go take care of yourself.”
Patients still get seen. The team doesn’t collapse. And the clinician returns later: steadier, safer, and fiercely loyal to a
workplace that treated them like a person instead of a replaceable part.
That’s the point: when systems are designed for humanity, safety improves for everyone.