By Peter Garrard, St George's, University of London
Age strikingly increases the risk of dementia, which affects around one in a hundred people aged between 65 and 69, but one in six of those aged 80 and over. As the progression of dementia takes place over many years and is not susceptible to medical treatment, the high costs of dementia are mostly those of providing long term care. The health economic impact of dementia exceeds that of cancer, heart disease and stroke combined.
The majority of cases of late life dementia are caused by the neurodegenerative condition known as Alzheimer’s disease. Pathologists recognise two characteristic changes in the brains of people who have died with this disease: abnormal protein aggregates deposited between nerve cells (called amyloid plaques) and bundles of chemically-altered protein filaments that destroy neurons from the inside (neurofibrillary tangles).
It is assumed that drugs with actions that modify one or both of these abnormal processes will give rise to some change in the clinical course of the disease. Yet after more than a decade, the race to find the first “disease modifying agent” for Alzheimer’s has still not been won.
One obstacle has been the slow evolution of the disease. Like other organs, the brain contains more nerve cells than are needed for survival. This redundancy is advantageous, as it allows function to continue after damage – from a stroke or head injury, for example. But when the damage comes from a disease that encroaches slowly, destroying the brain one unit at a time, parallel compensatory activity masks any behavioural change until the disease is so advanced that the brain is beyond repair.
There are solutions to this impasse. One of the most successful has been the introduction of the term “mild cognitive impairment” to capture a group of people whose cognitive abilities have changed in later life, but who remain independent in day-to-day activities. Among people with mild cognitive impairment will be a proportion whose symptoms truly represent the earliest stages of Alzheimer’s disease. Others will find their problems remain stable, which reflects the changing cognitive profile of normal ageing. And some others will even experience an improvement.
A simple, reliable (and preferably inexpensive) laboratory test to identify the subgroup destined to progress would herald a new era of disease-modifying drug discovery and large-scale trials. Unfortunately, the best of the disease biomarkers currently available are cumbersome, expensive and only partially specific. Amyloid imaging and spinal fluid analysis correctly detect more than 90% of Alzheimer’s cases, but positive results also occur in as many as 50% of people without the condition.
Prevention, because there is no cure
An alternative strategy for reducing the burden of Alzheimer’s disease is one of prevention, based on identifying and reducing exposure to its risk factors. In its most common forms, the condition has multiple antecedents, which include both genetic predispositions and critical environmental differences, including traumatic brain injury and educational attainment (so legislating for cycle helmets and improving the quality of state-funded education may have important long-term as well as more immediate benefits to society).
There is also evidence that raised blood levels of the amino acid homocysteine constitute an important independent risk factor for dementia. Some individuals are born with high concentrations of homocysteine, which damages the linings of blood vessels, causing strokes and heart attacks in people in their twenties and thirties. Ageing brings about a gradual elevation, and studies of ageing populations have shown that higher homocysteine levels significantly increase an individual’s risk of developing dementia.
Recent studies associating raised homocysteine with higher rates of shrinkage on brain scans, and greater neurofibrillary tangle burden at post-mortem, have strengthened the evidence for a biological connection between homocysteine and Alzheimer’s pathology.
Homocysteine is not obtained from the diet, so this enhanced risk of dementia cannot be mitigated by avoiding particular foods. It is, however, critically dependent on the body’s B vitamin status, as these vitamins promote its conversion into non-toxic and biologically useful chemicals. Low levels of vitamin B12 and folic acid thus lead to higher concentrations of homocysteine, while regular dietary supplementation returns them to normal.
Prescribing vitamin B
So, might the economic burden and individual suffering associated with Alzheimer’s disease be reduced by the simple and inexpensive expedient of prescribing B-vitamins to those with high levels of homocysteine?
The VITACOG trial, a preliminary clinical trial in subjects with high plasma homocysteine levels, showed that the brains of those who received B-vitamins shrank significantly less rapidly than those of the placebo group, particularly in areas that are associated with early pathological changes in Alzheimer’s.
Such a striking result seemed to indicate the need for a nationwide trial to test whether the outcome would translate into a clinically important disease-modifying effect on the rate of progression in mild cognitive impairment.
The arguments for conducting the trial were overwhelming and, with the assistance of a national network of experts in dementia and clinical trials, I prepared the scientific and economic case for funding. Opposition to the idea, however, appeared from an unexpected quarter – a meta-analysis of cognitive outcome data taken from completed trials of B vitamins for stroke and heart attack prevention. Somehow, a statistical miscellany of recycled results was rapidly elevated to a status little short of definitive scientific proof.
Meta-analysis can be a powerful way of drawing robust conclusions from the results of an experiment that has been conducted multiple times on small populations. In the B vitamin case, the numbers included in the meta-analysis were impressive. Yet numbers mean nothing if the data is neither uniform nor directly relevant to the question. Closer scrutiny revealed that few of the trials focused on dementia, that the ages of patients who took part were well below those associated with the development of Alzheimer’s disease, and that inclusion did not require the presence of mild cognitive impairment. And astonishingly, the results from the VITACOG trial were not included.
Unsurprisingly, the outcome of the pooled analysis was anodyne: neither the treatment nor the placebo group showed any meaningful change on any measure of cognitive status during follow-up. In other words – to quote one of the members of the original VITACOG study team – the analysis merely demonstrated that “taking B vitamins won’t prevent cognitive decline in those who, overall, do not show cognitive decline anyway.”
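The dilution at work here can be sketched numerically. The snippet below is a minimal illustration of fixed-effect (inverse-variance) pooling, the standard arithmetic behind a meta-analysis; all trial numbers are hypothetical and chosen only to show how one small positive trial in a high-risk group vanishes when pooled with large trials in populations not expected to decline.

```python
# Hypothetical sketch of inverse-variance (fixed-effect) meta-analysis.
# None of these figures come from real trials; they illustrate how pooling
# irrelevant studies can pull a real effect toward the null.

def pool(effects, ses):
    """Return the inverse-variance weighted pooled effect and its SE."""
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = (1 / sum(weights)) ** 0.5
    return pooled, se_pooled

# One small trial in a high-risk, cognitively declining group shows a benefit...
effects = [0.40]            # standardised effect of treatment on cognition
ses = [0.15]                # standard error (small trial, wide uncertainty)

# ...pooled with several large trials in younger, low-risk populations,
# where little decline (hence little detectable benefit) is expected.
effects += [0.02, 0.01, -0.01]
ses += [0.05, 0.04, 0.05]

pooled, se = pool(effects, ses)
print(f"pooled effect = {pooled:.3f} (SE {se:.3f})")
# → pooled effect = 0.019 (SE 0.026)
```

Because the large null trials carry far more weight, the pooled estimate lands near zero even though the one trial conducted in the relevant population showed a sizeable effect – which is precisely the objection raised against the B vitamin meta-analysis.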
Yet the absence of any difference between the two treatment arms has been widely, and without qualification, misinterpreted as evidence against the benefits of B vitamins in Alzheimer’s disease.
In recently published letters to the journal, I and many colleagues from around the world have pointed out the flaws in the meta-analysis and its harmful misinterpretation: harmful for medical research, for the cause of dementia prevention and, most of all, for the thousands of individuals who could have benefited from a safe and simple intervention.
We remain convinced that the clinical benefits of B vitamins in groups at high risk of Alzheimer’s disease should be allowed to be rigorously tested, but the damage caused by the public misinterpretation of a null study will take a while to unpick.
This article was originally published on The Conversation.