By guest contributors Akhil Bansal and Stephen Luby
The availability bias, introduced by Nobel Laureates Amos Tversky and Daniel Kahneman in 1973, describes how we evaluate the likelihood of events depending on how easily accessible they are in our memory. As a result, singular, memorable moments exert an outsized influence on our future decisions. Further, our temporal or psychological proximity to an event, and how easily we can integrate an event into our existing understanding of the world, also influence how likely we judge that event or experience to be.
To test this, Tversky and Kahneman asked a group of study participants to think about the number of words that start with the letter k and compare this to the number of words with k as the third letter. ‘Which is more common?’, they asked.
The pair found that twice as many people thought k occurred more frequently at the beginning of a word than in the third position. In fact, k appears far more often in the third position in a word. However, because words that start with k are easier to mentally access than words with k as their third letter, they were overestimated by the participants.
This availability heuristic affects individual physicians and the decisions they make. Several studies show that recent experiences with a particular diagnosis make physicians more likely to make the same diagnosis again, even if it is incorrect. Even reading Wikipedia articles about a certain disease several hours prior can make physicians more likely to misdiagnose it in their patients.
This availability bias also affects healthcare systems. In particular, it shapes how we process and learn from the COVID-19 pandemic. As countries bring COVID-19 under control over the coming months and years, the conversation will shift towards what we can learn from the pandemic, and how we can stop something like it from happening again. But precisely in that framing lies the availability heuristic we must avoid: ‘something like it’. Forecasting through a lens tinted by the availability heuristic leaves us with low-quality and incomplete information, which unsurprisingly leads to low-quality and incomplete solutions.
Specifically, the next catastrophe to challenge healthcare systems on a scale similar to or greater than COVID-19 is difficult to predict and, more likely than not, will be of a different origin. Extreme weather events, water crises, weapons of mass destruction, nuclear war, engineered pandemics and artificial intelligence (amongst others) all pose similar if not greater risks than naturally occurring infectious diseases. The likelihood of most of these risks is increasing: there is robust evidence that the probability and magnitude of extreme weather events are growing, water and energy crises are escalating at an alarming rate, and the falling cost and rising potency of artificial intelligence increase the likelihood of its misuse and misalignment.
For some of these risks, we have direct evidence of how they could overwhelm healthcare systems: electricity and water shortages and extreme weather events, for instance, already imminently threaten the functioning of healthcare systems. For other risks, there are conceivable and likely downstream effects that would stress healthcare systems. For example, nuclear war could expose a significant proportion of the population to dangerous quantities of ionizing radiation, and the development of lethal autonomous weapons could lead to mass casualty incidents on an unprecedented scale.
If we focus simply on avoiding ‘something like’ COVID-19 again, we risk equipping healthcare systems solely to deal with a future naturally occurring viral infectious disease. This approach may miss the larger lesson of the COVID-19 pandemic: that there is a growing probability that our planetary health may be challenged at any time by threats from a variety of sources, whether biological, natural, technological, or societal. And regardless of where the next existential threat comes from, it may so strain the healthcare system that loss of life from its under-performance could be catastrophic.
Our healthcare systems should be planning for a wide range of potential challenges; one health policy approach to this is the notion of all-hazard preparedness, or an all-hazard approach (AHA). AHA is a framework based on the idea that different hazard scenarios share commonalities and can therefore be managed with a common plan for hazard mitigation and preparedness. AHA focuses on developing capacities and capabilities that are transferable to the full spectrum of natural and man-made hazards. Some of its key considerations include securing the food, water and energy supplies of hospitals, building early detection systems, increasing the surge capacity of human personnel, and improving countermeasure development and approval, infection control practices and equipment, and communication with healthcare workers.
The benefits of AHA are multifaceted: it not only increases the likelihood that healthcare systems can respond to unexpected shocks (transformative capacity), it also strengthens healthcare systems in their ‘everyday’ or ‘ordinary’ functioning (absorptive capacity) and builds resilience in times of greater need (adaptive capacity). This means that even if low-probability, high-impact catastrophic events do not eventuate, an AHA remains a cost-efficient and effective way to build a healthcare system that is stronger in its day-to-day functioning. Further, whilst building resilience to specific risks (a top-hazards approach) is still worthwhile, it is contingent on successful risk assessments. Since there is considerable uncertainty over which risks are most likely and thereby pose the greatest concern to healthcare systems, AHA presents a more robust systems approach.
The World Health Organization has developed a strategic framework for AHA. It describes four key policy areas, all of which require robust governance, capacity building and resource availability: operational readiness; health system resilience; One Health; and whole-of-government/whole-of-society systems and approaches. It recommends action across each policy area at the community, national and global levels. Several governments have established AHA frameworks and implemented them with success, despite some concerns regarding their applicability. Both the evidence of success and the theoretical frameworks governing AHA support its merit as a worthy and viable approach to the threats humanity faces today and in the future.
Although we face a future of uncertain and growing existential threats, a robust policy approach will be instrumental in building the resilience of our healthcare systems and safeguarding the longevity of humanity.
Akhil Bansal is a clinical doctor currently working at Charity Entrepreneurship, a nonprofit that identifies and incubates evidence-based and cost-effective interventions. He has a particular interest in global health and development, as well as policy. He previously worked on health system resilience and preparedness at both the Stanford Existential Risk Initiative and Oxford Pharmagenesis.
Professor Stephen Luby is the Director for Research of Stanford’s Center for Innovation in Global Health, a Professor of Medicine (Infectious Diseases) and a Senior Fellow at Stanford’s Woods Institute for the Environment and the Freeman Spogli Institute for International Studies. Prof. Luby has worked full time on research in low-income countries for the last 25 years, including living in Pakistan for 5 years and in Bangladesh for 8 years. Prof. Luby is particularly interested in developing and evaluating approaches to counter the perverse incentives through which people earn money by destroying the environment and health.