Solving the dementia dilemma

Written by Coco Newton


Remarkably, only 1 in 10 people reading this know that dementia is the leading cause of death in the UK.

It has been since 2015, when it overtook heart disease in men; for women, it has held that position since 2011. Ageing populations mean the number of people living with dementia in the UK is expected to reach one million by 2025, and a third of children born today will likely develop it within their lifetime [source].

We are only beginning to comprehend the complex causes of this silent global trend, and age is just a risk factor, not a cause. In fact, such is the magnitude of recent scientific discoveries in dementia that researchers, doctors, and public health bodies alike are struggling to agree on what dementia even means and how we should diagnose it.

In this blog post, I explore why our choice of definition for dementia can have a potentially devastating impact on how we live with it, and why a radical rethink of traditional approaches is required.

Changing definitions

Dementia refers to a clinical condition resulting from a brain disease, like Alzheimer’s or Parkinson’s, that causes premature brain cell death. It involves a slow loss of cognitive functions such as memory and planning, along with changes to emotions and personality, over the course of several years (dementia comes from the Latin demens, meaning ‘out of one’s mind’).

For most of those affected by it, the dementia journey is one of enduring uncertainty and sorrow. In contrast to every other terminal illness, dementia presents two deaths – the mental and the physical – and affects entire communities for years as people begin to get lost around their neighbourhoods, become unable to manage daily chores, and eventually require full-time care, with many remaining hopelessly in denial.

To me, the cruellest discovery in modern-day medical research is that dementia diseases begin much earlier than we ever anticipated.

New blood tests for Alzheimer’s disease markers can be positive in people as young as 50 who have no symptoms at all. This silent ‘preclinical’ disease phase can last several decades as brain cells slowly begin to die, before dementia becomes apparent. Not only does dementia deprive us of our hope for a dignified old age, but also of a carefree midlife. Alarmingly, studies suggest that 10% of people in their fifties might have preclinical Alzheimer’s disease, rising to 40% of people aged 90 or older [source: Chételat et al., NeuroImage: Clinical, 2013]. Our most feared disease has suddenly moved a lot closer to us.

Where do we draw the line?

But the story is more complicated: not all people with midlife disease markers will go on to develop dementia.

In a study in Sweden, a third of people who tested positive for Alzheimer’s disease remained symptom-free when followed up nine years later. Even ‘super agers’, those with the cognitive ability of individuals several decades younger, often show evidence of dementia disease pathology at death.

We are faced with a philosophical dilemma over what a diagnosis of a dementia disease actually means. Is it the risk-bearing pathology lurking silently in our midlife brains, or is it the end-of-life onset of a devastating decline in our everyday functioning? Is it both?

And who should decide between these two equally inadequate alternatives? The answers to these questions carry different but profound ramifications for society, because they ultimately determine how most of us die.

Option A: Diagnosis by biology

Scientifically and medically, it makes sense for a diagnosis to focus on the detection of early disease markers, so that the pathology of toxic proteins can be removed in good time before dementia symptoms emerge. This is the same perspective we already take on treating cancer.

Several clinical trials in the past two years have shown promise of drugs and non-invasive brain-stimulating technologies capable of slowing down or even halting the build-up of these toxic Alzheimer’s proteins. The era of treating dementia diseases no longer appears very far away, according to the US-based Alzheimer’s Association working group, which is leading the call for pathology-based diagnostic criteria.

A trillion-dollar question is whether this type of proactive diagnosis makes sense for societies and individuals. And it is worth trillions: the annual UK cost of dementia has risen by over 60% since 2010, reaching £25 billion in 2021, and this doesn’t account for the estimated 1.1 billion hours of unpaid support that (mostly female) family members and caregivers contribute.

Health systems could benefit from significantly reduced dementia care costs if dementia diseases are proactively managed, and in parallel the drug development industry would gain immensely from a blanket population-wide roll-out of blood tests and therapies (notably, as a New York Times article reported, around two thirds of the Alzheimer’s Association working group receive payments from industry sources). To capture all potential dementia cases, every single person around the age of 50 would need to be screened with a pathology blood test and receive appropriate treatment.

Such an approach could theoretically eradicate dementia, but it comes at a steep cost. We risk over-medicalising a large proportion of society, treating people for a far-off condition they may never develop. Let’s also not forget the serious side effects that come with currently available drug treatments: brain swelling, bleeds, and death.

And if treatment efficacy is even a little less than 100%, millions of middle-aged people could be left with a dementia disease diagnosis and no certainty as to whether they will ever experience any symptoms or decline. Beyond the personal turmoil of this waiting game, just imagine the employment discrimination, driving licence, and insurance policy crises that would follow.

Option B: Diagnosis by symptoms

The alternative, reactive approach to diagnosis – waiting for people to present to their primary care doctor with emerging symptoms, followed by a confirmatory cognitive assessment and blood test – is not superior.

This approach is the principle behind today’s model of dementia diagnosis (though blood tests are not currently in use, they are under review and will hopefully replace far lengthier brain scan tests).

Although this approach eliminates the risk of overdiagnosis, it means that the 20-year window of opportunity for treatment is lost, and as current statistics show, it is a suboptimal way of approaching dementia diagnosis in today’s healthcare systems. The current diagnosis rate in the UK is 65%, meaning the remaining 35% of cases, amounting to nearly half a million older adults, are unknowingly living with dementia, which goes a long way to explaining the estimated 1.1 billion hours of unpaid care each year.

There are multiple system-level factors contributing to this lag, including long waiting times and insufficient scanner and specialist capacity (the UK has the lowest number of MRI and PET scanners per million people of all G7 countries, and it would require a 10-year investment of £14 billion to reach just the G7 average) [source: Mattke et al., Journal of Prevention of Alzheimer’s Disease, 2024].

But one of the biggest factors is inherent to the reactive approach itself. To even begin the diagnosis journey, clinicians rely on people to recognise signs and act on them – with very little education and awareness of what those signs are, and for a condition that comes with immense fear and stigma. With the reinforced need to now move towards earlier diagnosis, how early should we be telling people to look out for signs?

As my research into the design of dementia health systems is beginning to show, the fundamental barrier to a reactive diagnosis succeeding is not what happens inside the clinic but getting people to the clinic in the first place – and early enough to make a difference.

A new mindset is required

In my view, the dementia problem boils down to an issue of mindset. If we can’t afford to wait until it’s too late under a reactive diagnostic approach, but we also can’t be too proactive because we risk over-medicalising, then we might be asking the wrong questions.

Rather than conceiving of a solution focused on the complex problem of early detection and intervention, we should approach from a different angle: the problem of lifelong awareness and protection, for which we are all responsible.

Dementia is not an inevitable part of normal ageing and shouldn’t be ignored until it’s too late. As I’ll go on to explore in Part 2 of this piece, more than a third of dementia cases might relate to certain lifestyle factors and the detrimental effects of our modern environments, which impact our brain health in unfathomable ways.

Hope for this new mindset comes from studies into dietary changes, vaccines that can prevent the build-up of disease pathology, cold water exposure therapy, the advent of wearable technologies, and from a rare familial kindred of remote Colombian mountain communities, where a quiet revolution in the discovery of protective genes against dementia is unfolding.

Our biggest killer could well be one that we can protect ourselves against. The real question to be answered, then, is how we put this new mindset into practice.


Coco Newton

Coco is a neuroscientist and entrepreneur driven by a vision to make dementia preventable. Since completing a PhD focused on improving the sensitivity of dementia diagnostic assessments, she has pivoted as a Schmidt Science Fellow into health systems design and engineering to better understand the broader complexity and human perspective of dementia. Alongside research positions at Cambridge, UCL, and TU Delft in the Netherlands, Coco is currently a King's E-Lab Research Associate where she is building a brain health start-up.
