HealthManagement, Volume 25 - Issue 5, 2025


Medical error is a systems issue, not incompetence. Recognising three victim groups, making open disclosure routine and applying human-factors thinking within a Just Culture build safety. Belgium’s liability rules deter reporting; a sanction-free national scheme with legal immunity and feedback loops is proposed. Mediation centres turn grievances into learning, while leadership’s ‘golden triangle’ aligns mission, culture and practice to normalise reporting and strengthen compassionate care.

 

Key Points

  • Medical errors arise from system failures, not individual incompetence.
  • Patients, clinicians and organisations are three victim groups that require support.
  • Open disclosure and human-factors methods underpin a Just Culture.
  • Belgian liability rules deter reporting; a sanction-free national scheme is proposed.
  • Mediation and leadership’s ‘golden triangle’ turn grievances into learning and safer care.

 

Introduction

Medical errors are among the leading causes of preventable harm in healthcare. Yet, behind every mistake lies a story—not of incompetence, but of human vulnerability in a high-stakes environment. The vulnerability to error is amplified by the growing complexity of healthcare systems, where a patient interacts with an average of 17.8 to 26.6 different professionals during a single hospitalisation (Whitt et al. 2007; Schaad et al. 2025). Doctors, expected to perform flawlessly during chronic understaffing, information overload and emotional fatigue, are often the last to receive understanding when an error occurs.

 

For a long time, the analysis of medical errors was shaped by a cultural bias that focused on individual blame rather than systemic learning. Studies show that professionals often refrain from reporting incidents because of fear of personal repercussions, mistrust in institutional responses or the perception that such reports will not lead to meaningful change. This narrow focus on individual responsibility not only stigmatised those involved but also obscured underlying organisational weaknesses that contributed to the event (Waring 2005; Evans et al. 2006). Instead of asking “What went wrong?” the system too often asks “Who is to blame?” This culture of fear and silence not only discourages learning but also contributes to burnout, moral injury and workforce attrition (Asakawa et al. 2022).

 

As the healthcare sector struggles with increasing staff shortages and declining retention, addressing the root causes of human error becomes urgent. Mistakes must be seen not as failures of character, but as opportunities for reflection, redesign, innovation and growth. In this article, we examine the nature and frequency of medical errors, discuss the psychological toll on clinicians and explore systemic reforms rooted in compassion (culture), transparency (mission) and leadership, which together form a ‘golden triangle’. By shifting the focus from blaming and shaming to learning, we can build safer environments—for both patients and those who care for them.

 

Definitions and Conceptual Framework

To understand why doctors make mistakes, it is essential to move beyond technical competence and examine the broader psychosocial, organisational and human factors, the so-called non-technical skills, at play in healthcare systems (Wu 2000). Errors are rarely the result of individual negligence; instead, they emerge from complex interactions between people, systems and environments.

 

First, Second and Third Victims

First victim: the patient harmed by a medical error.

Second victim: the healthcare provider involved, who may suffer guilt, shame, anxiety or burnout (Wu 2000).

Third victim: the institution, which may experience reputational damage and loss of trust.

These concepts underscore the wide-ranging consequences of errors and the need for a system-wide approach and support, not punishment (Figure 1).

 

 

Definitions of Key Terms

Medical error: A failure in the execution of a planned medical action or the use of a wrong plan to achieve an aim. Errors can occur at any stage of care (diagnosis, treatment, prevention) and may or may not cause harm (Reason 2000).

 

Fault: A moral or legal attribution of blame to an individual for a harmful act or omission, often implying negligence or intent. It differs from ‘error’, which may arise without intent or negligence.

 

Incident: An unintended event during the care process that caused, could have caused or might still cause harm to a patient, a staff member or equipment.

 

Near miss: An unplanned event that did not reach the patient or result in harm due to timely intervention or sheer luck. Near misses often become valuable learning opportunities.

 

Near fatal: A subset of near misses in which the incident could have reasonably resulted in death if not intercepted.

 

Complication: As defined by the Dutch Internists’ Association, it is an unintended and undesirable event or condition during or following medical-specialist action that negatively affects the patient's health and necessitates an adjustment to treatment or leads to irreversible harm (Nederlandse Internisten Vereniging n. d.).

 

Serious Adverse Event (SAE): A medical occurrence that results in death, is life-threatening, requires hospitalisation or prolongation of existing hospitalisation or results in persistent/significant disability.

 

Punishment: Disciplinary or legal action aimed at individuals, such as blame, sanctions or stigma, which suppresses openness and hinders learning from errors.

Understanding these distinctions is critical for effective reporting, learning and system improvement.

 

Open Disclosure

Open disclosure refers to the transparent communication of medical errors to patients and their families. Despite its ethical imperative, many clinicians avoid it due to fear of litigation or professional consequences, particularly in unsafe systems lacking legal protections for honest disclosure. As highlighted by Eniola and Gambino, removing fear through structured disclosure frameworks is essential to support clinicians and foster transparency (Eniola and Gambino 2019).

 

According to institutional protocols for post-incident communication, the core principles of open disclosure include (RIBI AZ Oudenaarde n. d., NVZ n. d.):

  • Acknowledging that something has gone wrong,
  • Providing a factual explanation of what happened (as far as known),
  • Describing immediate actions taken to care for the patient, eg by adapting existing protocols or action plans,
  • Expressing regret or an official apology,
  • Outlining steps taken to prevent recurrence, as proof of a learning environment.

 

Healthcare providers are encouraged to use clear, compassionate language and offer continued support, including the involvement of mediators or patient advocates where appropriate. Open disclosure also includes internal documentation and, in certain cases, notification of external bodies (eg liability insurers, regulatory agencies).

 

Open disclosure is not just an ethical obligation—it is a foundation for rebuilding trust, reducing litigation and learning from failure. It reflects a mature safety culture where transparency is valued over self-protection. We will discuss how the legal system may affect this open disclosure procedure.

 

 

Human Factors and Ergonomics

Human Factors and Ergonomics (HFE), also referred to as non-technical skills (NTS), is an established discipline that studies how people interact with tools, systems and environments to improve safety and performance (Dekker 2014). Originally developed in aviation, HFE is increasingly recognised in healthcare for its critical role in patient safety.

 

Key HFE domains include:

  • Situational awareness: understanding and anticipating what is happening in the environment,
  • Decision-making: selecting appropriate actions from multiple options under uncertainty,
  • Communication: clear, effective and transparent information exchange among teams and patients,
  • Teamwork: collaboration toward shared and common goals,
  • Leadership: guiding and supporting teams, especially during crises,
  • Resilience: coping with stress and bouncing back after adversity,
  • Serendipity: the finding of something unexpected and useful while looking for something completely different,
  • Empathy: understanding the emotional experience of others,
  • Agility and innovation: adapting quickly and creatively to new challenges,
  • Learning: using feedback and past experience to change behaviours and systems.

 

As Della Torre et al. argue, these skills must be taught and integrated through practice, leadership modelling and a culture of reflection (Della Torre et al. 2021). HFE is not an optional “add-on” but central to safe care.

 

Patient Advocacy

Patient advocacy plays a vital role in bridging the gap between healthcare providers and those they serve. Advocates help ensure that patients and families are not only informed but also actively involved in decisions after adverse events. Beyond individual support, advocacy groups amplify patients’ voices to promote transparency, fair compensation and systemic learning. Their presence can counterbalance institutional inertia and cultural resistance to disclosure, ensuring that safety reforms remain patient-centred. By engaging patient advocates in reporting systems, mediation centres and policy design, healthcare organisations strengthen trust, enhance accountability and align safety initiatives with the lived experiences of those most affected.

 

Incidence and Types of Errors

Medical errors are alarmingly common and represent a significant cause of preventable harm and mortality worldwide. Estimates suggest that in Europe, medical errors are the third leading cause of death in hospitals, following cancer and cardiovascular disease. Studies show that per hospital admission, patients experience one to two medication-related errors on average (Malbrain et al. 2020; Ho et al. 2020).

 

In Belgium, according to figures cited by Dr. Michel Bafort, more than 2,000 people die annually due to preventable medical errors, exceeding the number of deaths from road traffic accidents (Bafort 2019; Bafort 2020). A 2017 survey by vzw “Medisch Falen” found that 70% of 2,500 nurses admitted to making at least one error per year (ibid). Such statistics likely underrepresent the true scope due to underreporting and cultural taboos.

 

Common types of medical errors are as follows:

  • Diagnostic errors: delayed, missed or incorrect diagnoses,
  • Medication errors: wrong drug, dose, timing or route,
  • Communication failures, particularly during handovers or in interdisciplinary teams,
  • Procedural mistakes: lapses in surgical or bedside techniques,
  • System-related issues: staffing shortages, inadequate training, poor IT systems.

 

The COVID-19 pandemic intensified these vulnerabilities, exposing gaps in preparedness and communication, while amplifying stress, fatigue and burnout among frontline workers. These pressures created ideal conditions for human error.

 

Understanding the incidence and nature of errors underscores the urgent need for systemic interventions, particularly those rooted in human factors and safety science. The focus must shift from identifying individual blame to recognising recurring patterns and structural deficiencies.

 

From Blame to Understanding

In many healthcare settings, the immediate reaction to an error is to seek the person responsible (Wu 2000; Reason 2000). This “name, blame and shame” culture has persisted for decades, yet it is profoundly counterproductive. Root cause analysis from safety-critical industries shows that most errors are not the fault of individuals but of systems.

 

Instead of focusing on "who," we must ask "why." Why did the decision make sense at the time? What pressures were present? Were protocols ambiguous, training inadequate or workflows flawed? These questions shift the focus from retribution to learning and prevention.

 

A Just Culture, as promoted by safety experts such as Sidney Dekker and James Reason, seeks to balance accountability with a non-punitive response to error. This approach recognises the difference between human error, at-risk behaviour and reckless behaviour. Only the last warrants disciplinary action. The others demand system redesign and education.

 

Changing from a blame culture to a learning culture requires more than policy—it demands psychological safety, where staff feel empowered to speak up without fear of reprisal. It also requires leadership commitment, clear reporting pathways and reinforcement that mistakes are learning opportunities.

 

A Mistake Does Not Mean You Are Incompetent

For many clinicians, making a mistake triggers a deep sense of failure. Medicine often selects for perfectionists—individuals who are driven, conscientious and highly self-critical. In such an environment, equating error with incompetence is not uncommon. However, this mindset is deeply flawed and dangerous (Leape 1994).

 

Errors occur in all complex systems. Pilots, engineers and even AI algorithms make mistakes. What differentiates high-performing systems is not the absence of error, but the ability to detect, respond to and learn from it.

 

We recently came across a post on LinkedIn by the Australian surgeon Oliver M. Fisher, entitled “When You Tempt the Gods of Complications” (Fisher 2025).

 

Clinicians must be reassured that acknowledging an error is a sign of integrity and professionalism—not a mark of weakness. Creating a culture where errors can be disclosed and discussed openly is essential for both individual well-being and patient safety.

 

Organisations must therefore invest in:

  • Peer support programmes,
  • Second victim support services,
  • Regular debriefings and morbidity & mortality (M&M) rounds with a learning focus.

This shift in mindset is vital not only to improve safety but also to protect the mental health of healthcare workers, reducing burnout and enhancing retention.

 

Leadership, Mission and Culture – The Golden Triangle

Effective leadership plays a pivotal role in promoting safety, managing errors and fostering a culture of trust and transparency. The "golden triangle" of leadership, mission and culture is a powerful framework that aligns people and processes (Figure 3):

  • Leadership defines the behaviours and values modelled from the top down, setting the tone for psychological safety and system learning.
  • Mission gives meaning and direction, reminding staff why their work matters, particularly during adversity.
  • Culture is the sum of shared behaviours, attitudes and values that determine whether people feel safe to report, reflect and improve (Malbrain et al. 2020, Malbrain 2024).

 

The golden triangle of leadership, mission and culture should support teamwork at all organisational levels via the 7 Cs (Commitment, Coaching, Cooperation, Communication, Coordination, Cognition, Conflict Resolution), creating commitment to results, ie quality improvement and increased value for stakeholders. Typically, the relational aspect absorbs about 75% of the time and effort, compared with the rational one. A cultural broker can help increase and facilitate results and reduce the effort required.

 

As outlined by Malbrain and Rosseel during the COVID-19 crisis, culture and behaviour are the greatest barriers to transformation in healthcare. Behavioural change is difficult because people rarely act unless they feel urgency or discomfort. Leaders must model the standard, giving direct, respectful feedback; like a virus, their behaviour spreads quickly through an organisation.

 

In their crisis leadership strategy, Malbrain and Rosseel emphasised tools such as (Malbrain et al. 2020):

  • A clear, connecting narrative explaining both current challenges and future direction,
  • Single Line of Command (SLOC) to streamline decisions,
  • Shifting from rigid to iterative processes,
  • Empowering creative teamwork and distributed leadership,
  • Humility and listening as core traits of adaptive leadership.

 

Insights from “The Future of Critical Care” (De Waele et al. 2020) underscore that human capital is the cornerstone of quality care (Ho et al. 2020). Critical care requires holistic attention to team wellbeing, multidisciplinary synergy and burnout prevention. Leadership must balance technical expertise with emotional intelligence, adaptability and strategic foresight.

 

This demands more than generic governance: it requires professional, well-supported Chief Medical Officers (CMOs) who act as gatekeepers of patient safety. However, the role of the CMO is often compromised by outdated decision architectures, lack of autonomy and financial or political interference (Malbrain 2023). Despite being responsible for quality and patient safety, CMOs frequently lack the mandate, legal protection and structural authority to implement corrective action (Malbrain et al. 2023).

 

To safeguard patient outcomes, hospitals must:

  • Embed co-governance models where CMOs are equal partners alongside CEOs and CNOs,
  • Grant protected time, resources and staff support for CMOs,
  • Introduce credentialing pathways and leadership training for CMOs and department heads,
  • Enshrine the CMO’s decision-making power in matters of safety into law or policy.

 

 

In short, the triangle only functions when each point reinforces the others: leadership without culture becomes authoritarian; mission without leadership is empty rhetoric; culture without direction risks stagnation. A resilient healthcare system demands all three—united through courageous, adaptive leadership, with the CMO at the helm of patient safety.

 

The Role of the Cultural Broker

A "cultural broker" serves as a bridge between frontline staff and hospital leadership, translating values, mediating communication and aligning diverse interests toward a common safety goal. In safety-sensitive settings, this role is critical to building trust. Cultural brokers help dismantle hierarchy-driven silence and ensure that the realities of daily clinical work inform systemic decisions (Malbrain et al. 2020, Ho et al. 2020).

 

Cultural brokers can:

  • Identify misalignments between policy and practice,
  • Encourage upward feedback without fear,
  • Promote respectful dialogue and conflict resolution.

 

During COVID-19, cultural brokers proved crucial as organisations faced a behavioural and cultural crisis alongside the medical one. As outlined by Malbrain and Rosseel, true transformation in healthcare starts not with protocols, but with behavioural change (Malbrain et al. 2020). Leaders must lead by example, offering consistent feedback and modelling safe behaviour. In this context, cultural brokers became key figures in communicating a "connecting story", explaining both the current reality and the desired future and mapping the path in between (ibid).

 

The experience at UZ Brussel demonstrated how cultural brokers can support leadership in guiding departments through rapid shifts (De Waele et al. 2020). They helped implement adaptive frameworks like Single Line of Command (SLOC), move from rigid to iterative strategies and encourage creative teamwork (Malbrain et al. 2020). Their presence encouraged open communication, humility and honest reflection—all critical for navigating uncertainty.

 

Cultural brokers are especially important in breaking through the inertia of traditional hospital hierarchies, which often resist change due to power structures, ego and institutional rigidity (ibid). Their mission is not to impose new values, but to facilitate alignment between the hospital’s strategic vision and frontline realities. As Malbrain and Rosseel note, true resilience requires a clear shared vision, collective leadership and an open culture where curiosity replaces control (ibid).

 

Lessons from Aviation

Aviation has long been a benchmark for safety culture. After a series of catastrophic accidents in the 1970s, the industry embraced crew resource management (CRM), non-punitive incident reporting and checklists to improve safety. These principles helped transform aviation into one of the safest high-risk industries (Maillé 2023).

 

The lessons for healthcare are clear:

  • Normalise reporting of near misses and unsafe conditions,
  • Use structured debriefings after critical events,
  • Adopt simulation training to develop non-technical skills.

 

Arnaud Maillé, a Boeing 747 captain and instructor in non-technical skills, emphasised these points during a recent Belgian conference on Quality and Patient Safety in Intensive Care (ibid). According to Maillé, both aviation and healthcare share key features: complex environments, heterogeneous teams, hierarchical structures and the critical importance of communication. In both sectors, authoritarian leadership, lack of cohesion and unclear roles undermine safety.

 

Maillé argues that under pressure, communication is the first casualty—leading to tunnel vision and serious mistakes. Yet humans, unlike machines, are adaptive. In fact, in 85% of cases, people detect and correct errors within seconds. The goal is not to eliminate all errors but to prevent small errors from escalating into catastrophic events.

Maillé’s core message echoes James Reason: stop asking who made the mistake and start asking why (Reason 2000). Blame does not improve safety. Instead, healthcare must embrace a permanent, preventive analysis of small failures. This requires empowering everyone—from senior clinicians to junior staff—to report incidents, no matter how minor. Every deviation is a learning opportunity.

To build this culture, Maillé underscores the need for a Just Culture, where individuals acting in good faith are not punished (Maillé 2023). Fear of informal sanctions, reputational damage or social exclusion often suppresses reporting. Instead, he advocates for the normalisation of error as a route to expertise and organisational improvement.

 

He warns that latent factors—such as time pressure, staff shortages, checklist fatigue and frequent interruptions—are particularly common in hospitals. These organisational weaknesses create traps that can ensnare even competent professionals. Thus, human error is a symptom of system failure, not individual incompetence.

 

Finally, Maillé stresses the importance of situational leadership and team synergy. Effective leaders promote psychological safety, encourage open communication and adapt their leadership style to the needs of the moment. Ego, he warns, is deadly. Healthcare must replace egocentric, autocratic leadership with humility, listening and shared responsibility. Only in such an environment can the full potential of team-based care and patient safety be realised.

 

The Belgian Context: Why Doctors Don’t Admit Mistakes

In Belgium, structural and legal constraints actively discourage physicians from disclosing mistakes. As noted by Belgian gynaecologist Dr Michel Bafort, admitting an error can jeopardise professional liability insurance, fostering fear, silence and underreporting (Bafort 2019; Bafort 2020).

 

Bafort has argued that medical incidents are the third or fourth leading cause of hospital deaths, surpassing traffic accidents. Estimates suggest more than 2,000 preventable deaths annually in Belgium, echoing international findings that errors kill more people than car crashes.

 

Bafort proposes a national, sanction-free incident reporting system (“veilig melden”), inspired by aviation, to enable:

  • Safe, anonymous error reporting,
  • Multilevel learning (unit, hospital, national),
  • Legal immunity for timely self-reporting.

 

Across publications and policy proposals, Bafort advocates for a system that would require both internal safety (no disciplinary action within the hospital) and external safety (legal immunity from prosecution based on the report itself) (Bafort n. d.; Kohn et al. 1999; Hilfiker 1984; Leape 2002). His model calls for the creation of a Federal Council for Safe Medical Practice and the transformation of the Fonds voor Medische Ongevallen into a Fonds voor Medische Incidenten (FMI), focusing not on blame but on learning and fair compensation.

 

Early pilot projects in Belgium, such as one at AZ Alma, demonstrated that once the threat of sanctions was removed, incident reporting rose dramatically, confirming the “iceberg” of hidden errors. Such a national initiative could transform Belgian healthcare by embedding safety and transparency at every level.

 

Yet, Bafort’s proposals have not been without controversy. Patient advocates and critical observers have voiced concerns that “sanctievrij melden” or “safe incident reporting” might primarily serve the interests of liability insurers, shielding physicians while leaving patients with long and costly legal battles for compensation.

 

Solutions for Safe Incident Reporting and Complication Registration

The key to meaningful change lies in creating systems that encourage reporting and drive collective learning. This includes:

  • Anonymous, non-punitive reporting platforms,
  • Local and national complication registries to track trends,
  • Regular feedback loops to frontline teams.

 

These tools must be supported by strong governance, legal protection and a culture that prioritises improvement over punishment. Learning from incidents should be built into daily operations—not reserved for major crises. Hospitals must move from "compliance" to curiosity, asking: What can we learn today to do better tomorrow?

 

However, structural challenges remain. Chief Medical Officers (CMOs) often lack the tools, protection and authority needed to implement safe reporting systems effectively (Malbrain et al. 2023). Despite being held accountable for quality and patient safety, CMOs frequently operate without protected time, formal leadership training or a clear legal mandate. Moreover, many CMOs navigate conflicting demands between hospital boards, financial imperatives and clinical realities. This “split mandate” undermines their ability to prioritise patient safety.

 

To address this, a national safe reporting and learning system must:

  • Be led or co-designed by clinical leaders, including CMOs,
  • Include protected legal status for reporters and reporting institutions,
  • Offer transparent, multi-level feedback and benchmarks,
  • Operate under the umbrella of a neutral, independent oversight body.

 

Similarly, Belgian discussions around ‘Arts in Nood’ (‘Médecins en difficulté’), an independent Belgian organisation supporting healthcare workers who face psychological problems, stress that mediation should extend beyond patients to include the struggles of clinicians themselves. In this view, the same methodology of listening, anonymisation and systemic analysis can identify organisational failures that harm both patients and providers, creating the conditions for mutual trust and safer care.

 

The ideal system would:

  • Guarantee immunity from internal and legal sanctions for prompt reporting,
  • Separate reporting channels from legal documentation,
  • Include multiple feedback loops, from unit to national level.

 

Alongside Bafort’s “veilig melden” or “safe reporting”, Professor Béatrice Schaad provides a complementary model in Switzerland (Schaad 2024). She emphasises that effective safety systems must also address the relational and communicative dynamics between staff, patients and families. Her mediation centres are safe zones where patients, clinicians and hospital staff can discuss incidents collaboratively with trained mediators. The aim is not to assign blame, but to restore trust, detect quality and safety issues through in-depth interviews, resolve conflict and promote healing.

 

Rather than presenting the Lausanne model as the first to establish mediation centres, it is more accurate to emphasise its innovation in linking mediation to a systematic grievance collection and coding process. Unlike models focused mainly on repairing interpersonal relationships, the Lausanne approach recognises that patients and families often identify underlying quality and safety issues. By applying a taxonomy originally developed at the London School of Economics (Reader et al. 2014) and refined through their own study (Schaad et al. 2015), the team is able to transform grievances into targeted improvement projects. This methodology, recently published in NEJM Catalyst (Schaad et al. 2025), has also evolved to incorporate testimonies from healthcare professionals who experience difficulties with patients, thereby expanding its scope. These narratives now inform a broader range of improvement initiatives, addressing both patient and staff concerns within the hospital system.

 

Complementing structural reforms, Schaad and colleagues at Lausanne University Hospital developed a mediation centre that systematically collects and codes grievances from patients, relatives and professionals. Rather than treating complaints as threats, this model reframes them as opportunities for quality improvement projects and for restoring trust through dialogue. Importantly, grievance data feed back into hospital leadership, enabling systemic learning beyond individual blame (Schaad et al. 2025).

 

Recent work by Schaad further highlights that patient and professional grievances are not merely complaints but powerful signals of systemic dysfunction (Schaad 2025). Her analysis emphasises that most patients (≈61%) seek explanations rather than financial compensation and that narratives of dissatisfaction often reveal blind spots invisible to classical safety tools such as incident reports or satisfaction surveys. The Lausanne mediation centre has systematically coded over 6,000 experiences since 2012, showing how grievances can be transformed into targeted improvement projects.
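To make the coding step concrete, the aggregation of taxonomy-coded grievances into improvement signals can be illustrated with a minimal, purely hypothetical sketch; the `Grievance` fields, category names and example data below are illustrative assumptions, not the actual Lausanne or LSE taxonomy:

```python
# Hypothetical sketch: aggregating taxonomy-coded grievances into
# improvement signals. Fields and categories are illustrative only.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Grievance:
    source: str              # "patient", "relative" or "professional"
    category: str            # taxonomy code, e.g. "communication"
    wants_explanation: bool  # most complainants seek explanation, not money

def top_improvement_targets(grievances, n=3):
    """Rank taxonomy categories by frequency to prioritise projects."""
    counts = Counter(g.category for g in grievances)
    return counts.most_common(n)

reports = [
    Grievance("patient", "communication", True),
    Grievance("relative", "delay", True),
    Grievance("patient", "communication", False),
    Grievance("professional", "workload", True),
]
print(top_improvement_targets(reports))
```

In practice, such frequency counts would feed back to hospital leadership as candidate quality improvement projects, read alongside the qualitative narratives themselves.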

 

Schaad advocates for embedding mediation into hospital governance as an alternative to adversarial litigation.

 

Together, Bafort’s structural model, Schaad’s mediation approach and Malbrain’s golden leadership triangle form a synergistic solution: technical safety mechanisms aligned with humanistic, restorative practices. Belgium can take inspiration from all three, creating a safety culture that protects not only patients, but also caregivers and institutions.

 

Conclusion

Doctors do not make mistakes because they are careless or incompetent. They make mistakes because they are human (“to err is human”), operating under intense pressure in complex, often flawed systems. The only rational response to this truth is not punishment, but partnership—between clinicians, institutions and patients—to foster learning, empathy and improvement.

 

By shifting from a punitive mindset to one of inquiry and support, we protect not only our patients but also our healthcare workforce. The courage to confront human error with humility and openness is the first step toward a safer, more compassionate future in medicine.

 

Conflict of Interest

None.


References:

Asakawa M, Imafuku R, Kawakami C et al. (2022) Promoting a culture of sharing the error: A qualitative study in resident physicians' process of coping and learning through self-disclosure after medical error. Front Med (Lausanne), 9:960418.

Bafort M (2019) Arts wil meldpunt voor medische incidenten: ‘Ze eisen meer levens dan het verkeer’ [in Dutch]. De Standaard.

Bafort M (2020) Missers zijn onvermijdelijk, maar sanctievrij melden kan levens redden [in Dutch]. Knack.

Bafort M (n. d.) Dossier medische incidenten [in Dutch] (accessed: 03 October 2025). Available from ordomedic.be/nl/nieuws/dossier-medische-incidenten

De Waele E, De Mol J, Malbrain MLNG et al. (2020) Adaptive Strategies for Intensive Care During the Spread of COVID-19: The Brussels Experience. ICU Management & Practice, 1:20–27.

Dekker S (2014) The Field Guide to Understanding 'Human Error'. Ashgate Publishing.

Della Torre V, Nacul FE, Rosseel P et al. (2021) Human factors and ergonomics to improve performance in intensive care units during the COVID-19 pandemic. Anaesthesiology Intensive Therapy, 53(3):265–270.

Eniola K & Gambino C (2019) Taking the Fear Out of Error Disclosure. Fam Pract Manag, 26(6):36.

Evans SM, Berry JG, Smith BJ et al. (2006) Attitudes and barriers to incident reporting: a collaborative hospital study. Qual Saf Health Care, 15:39–43.

Fisher OM (2025) When You Tempt the Gods of Complications (accessed: 13 October 2025). Available from linkedin.com/posts/a-prof-oliver-m-fisher-md-phd-fmh-fracs-5b893097_surgery-resilience-training-activity-7348311143678939136-oqEr/

Hilfiker D (1984) Facing our mistakes. N Engl J Med, 310(2):118–22.

Ho S, Wong A, Butnar A et al. (2020) The Future of Critical Care: A Human Capital Perspective. ICU Management & Practice, 3:220–225.

Kohn L, Corrigan JM & Donaldson MS (1999) To Err is Human: Building a Safer Health System. Institute of Medicine (accessed: 03 October 2025). Available from nap.nationalacademies.org/read/9728/chapter/1

Leape LL (1994) Error in medicine. JAMA, 272(23):1851–1857.

Leape LL (2002) Reporting of adverse events. N Engl J Med, 347(20):1633–8.

Legemaate J (2005) Kwaliteit van leven meten in economische evaluaties [in Dutch]. Nederlands Tijdschrift voor Geneeskunde, 149(22):1203–1205.

Maillé A (2023) Ego is dodelijk – Lessons from aviation. Presentation at Belgian Quality and Patient Safety in Intensive Care conference (accessed: 03 October 2025). Available from mediquality.net/be-nl/topic/article/25891048/ego-is-dodelijk-het-advies-van-een-piloot-over-het-verbeteren-van-de-veiligheid-in-ziekenhuizen

Malbrain MLNG (2024) Blauwdruk voor M&M-discussie op basis van complicatieregistratie [in Dutch]. AZ Oudenaarde.

Malbrain MLNG & Rosseel P (2020) De tweede golf kunnen we nog net aan, maar wat erna? (We can just handle the second wave, but what thereafter?) [in Dutch] De Standaard, 23 June:26–27.

Malbrain MLNG et al. (2023) Verslag van ronde tafel meeting van Hoofdartsen. Nota Brainstorm: herziening van de rol en positie van de CMO [in Dutch].

Nederlandse Internisten Vereniging (n. d.) Complicatieregistratie [in Dutch] (accessed: 03 October 2025). Available from internisten.nl/complicatieregistratie/

Nederlandse Vereniging van Ziekenhuizen (NVZ) (n. d.) VMS Veiligheidsprogramma. Open Disclosure Protocol [in Dutch] (accessed: 03 October 2025). Available from nvz-ziekenhuizen.nl/vms-veiligheidsprogramma

Pomey MP, Schaad B, Lasserre-Moutet A et al. (2024) Towards a New Integrated Model for Taking Into Account the Experiential Knowledge of People With Chronic Diseases, Integrating Mediation, Therapeutic Education and Partnership: The Expanded Chronic Care Patient-Professional Partnership Model. Health Expect, 27(5):e70054.

Reader TW, Gillespie A & Roberts J (2014) Patient complaints in healthcare systems: a systematic review and coding taxonomy. BMJ Qual Saf, 23:678–689.

Reason J (2000) Human error: models and management. BMJ, 320(7237):768–70.

RIBI AZ Oudenaarde (n. d.) Nieuwe en oude procedures voor incidentmelding [in Dutch] (accessed: 03 October 2025). Available from azoudenaarde.be/patient/geef-feedback/incidenten

Schaad B & Caci M (2025) Improving Medicine Using Grievances Collected at a Mediation Center. NEJM Catalyst Innovations in Care Delivery, 6(5).

Schaad B (2025) Réinventons les soins grâce à celles et ceux qui s’en plaignent. Georg éditeur.

Schaad B, Bourquin C, Bornet F et al. (2015) Dissatisfaction of hospital patients, their relatives, and friends: Analysis of accounts collected in a complaints center. Patient Educ Couns, 98(6):771–6.

Vincent C, Stanhope N & Crowley-Murphy M (1999) Reasons for not reporting adverse incidents: an empirical study. J Eval Clin Pract, 5:13–21.

Waring JJ (2005) Beyond blame: cultural barriers to medical incident reporting. Soc Sci Med, 60:1927–1935.

Whitt N, Harvey R, McLeod G et al. (2007) How many health professionals does a patient see during an average hospital stay? N Z Med J, 120:U2517.

Wu AW (2000) Medical error: the second victim. The doctor who makes the mistake needs help too. BMJ, 320(7237):726–7.