Category Archives: Food & Diet

Scientists Re-create 170-year-old Beer


Back in 2010, some archaeologists investigating a 19th century Baltic Sea shipwreck found something more unusual than treasure in the ship’s cargo – four beer bottles fully intact, with the brew still sealed inside. The amber ale was likely brewed in Belgium back in the 1840s, and was on its way to ports in Scandinavia.

You might wonder how well it held up – surprisingly, not too badly for a brew nearly two centuries old. “These bacteria were still alive,” said Brian Gibson, a senior scientist at the VTT Technical Research Centre in Espoo, Finland, not far from where the bottles were discovered. Beer has been around for at least 7,000 years, brewed by the ancient Mesopotamians in what is now Iraq, and many breweries have worked to recreate beers from the Middle Ages and the American colonial era. But Gibson believes these are likely the oldest intact bottles of beer in the world.

Gibson and his colleagues from the University of Munich recently conducted an in-depth chemical and microbiological analysis of the beer, publishing their work this week in the Journal of Agricultural and Food Chemistry. Despite the inevitable contamination from salt water, they were able to learn quite a bit about the processes of mid-19th-century brewing.

“We have a reasonably good idea about what kind of hops were used, different ones than today,” said Gibson. “These hops would have been harsher, these days they are quite mild. The one surprising thing is the beers were quite mild. The original alcohol level was 4.5 percent, nothing extreme.”

Shortly after the bottles were retrieved from the sea, the discovery was celebrated with a monumental tasting, at which beer experts from throughout Finland came to sample the 170-year-old brew. Rather than pulling the corks, the researchers inserted a thin needle through them, taking samples from two different bottles in order to avoid exposing the contents to open air. Even so, the taste testing ended up being something of a disappointment. The researchers described the ancient beer’s scent fairly vividly in their paper, as a cross between “autolyzed yeast, dimethyl sulfide, Bakelite, burnt rubber, over-ripe cheese, and goat with phenolic and sulfury notes.” During its time under the sea, water had leaked through the corks, rendering the contents about 30 percent salt water.

Despite how good it looked, the beer was considerably degraded. Like modern beers, this beer had a shelf life – a sell-by date that had long since come and gone. Aside from the taste of sea water, the tasters had another issue. According to Gibson: “For the analysis, it was difficult to pick out the original flavors. We invited some of the most experienced beer tasters in Finland. The flavors were from bacterial contamination and not the original flavors of the beer.”

Gibson and his team therefore had to rely on further chemical analysis of the remaining sugars and alcoholic compounds to get a better idea of how the beer was made – their primary interest being the practices of pre-industrial brewing.

“We looked at esters, which give beer a fruity or flowery taste. Most of the compounds that we would expect were there. In terms of the fruitiness, probably similar to modern beers. High level of 2-phenyl ethanol which gives a rose or floral aroma.”

Compared with modern-day craft brews, Gibson said, the batch was similar to an amber or lambic-style ale, styles normally fermented with wild yeast. One of the beers had a fairly pronounced hop flavor, while the other likely had more of a fruit flavor, similar to modern summer beers. In many ways, the ingredients were fairly similar to those of modern beers, although 19th-century beer was likely much more sour, as brewers had no way of keeping acid-producing bacteria out of the brew during fermentation.

Sam Calagione, founder and president of Dogfish Head brewery in Milton, Del., has already shown great interest in the find; his company has worked to recreate historic beers since 1998, using recipes obtained from archaeological digs.

Dogfish’s “Midas Touch,” named for the fabled king of Greek legend, was based on the contents of a jar found in a 2,700-year-old tomb uncovered in Turkey – an ancient drink made from barley, saffron and white muscat grapes.

“The whole idea of looking backward for creative inspiration and culinary adventure is really catching on,” Calagione said. “All (the scientists) can give us is a laundry list of ingredients. It is up to us to come up with a creative recipe. What the alcohol content is, whether it’s filtered or carbonated. We have a lot of creative input in bringing these creative beers back to life.”

Stallhagen Brewery of the Åland Islands in Finland has recently recreated the Baltic Sea beer under the label “1843.” In addition to the beer bottles, the divers also found 150 bottles of champagne in the wreckage.

James Sullivan
James Sullivan is the assistant editor of Brain World Magazine and a contributor to Truth Is Cool and OMNI Reboot. He can usually be found on TVTropes or RationalWiki when not exploiting life and science stories for another blog article.

More than one good reason for eating mainly plant foods


Meat contains some important nutrients, but it’s not essential for a healthy diet. Many people, especially men in Western countries, are on average eating too much of it. Despite the vested interests that wish to maintain this status quo, there are very good reasons to curb your meat consumption.

A large body of evidence suggests vegetarians enjoy lower rates of cardiovascular disease, type 2 diabetes and hypertension. A 2013 study of over 70,000 individuals in the United States found a 12% reduction in premature death among vegetarians, and studies of healthy, long-lived populations consistently show only modest consumption of red meat.

Clearly, advice to favour plant-based foods and reduce meat intake should now be part of the standard dietary advice given by doctors and nutritionists, especially since myths that a vegetarian diet leads to inadequate levels of iron or protein have been dispelled. But a recent report by a US nutrition advisory committee that suggests exactly this has come under fire.

Under attack

The scientific report of the 2015 dietary guidelines advisory committee will form the basis of the latest US dietary guidelines, which will aim to curtail the growing national prevalence of lifestyle diseases.

Meat contains some important nutrients, but it’s not essential for a healthy diet.
Mike/Flickr, CC BY-NC-SA

Half of American adults have one or more preventable chronic diseases, and over two-thirds of the adult population and one-third of children are overweight or obese. (Australians are not far behind, with over 60% of adults and 25% of children overweight or obese.)

The US report has been in the news for its recommendations to scrap restrictions on eggs and the advice to limit red meat as well as refined grains and sugary foods and drinks.

Australian dietary guidelines have never restricted eggs, but the most recent set, released in 2013, made similar recommendations about avoiding large quantities of red meat – especially for men. Indeed, our guidelines have emphasised the need for more plant foods, including vegetables, legumes, nuts, seeds and grains (as wholegrains rather than refined grain products) since the first set was released in 1981.

But the US committee has faced quite strident criticism from the North American Meat Institute and other meat industry groups for its focus on diet’s impact on human health and the environment. The report states:

The major findings regarding sustainable diets were that a diet higher in plant-based foods, such as vegetables, fruits, whole grains, legumes, nuts and seeds, and lower in calories and animal-based foods is more health promoting and is associated with less environmental impact than is the current US diet.

Greenhouse gas emissions from the livestock sector, mostly methane and nitrous oxide, are estimated to account for 14.5% of the global total.
U.S. Department of Agriculture/Flickr, CC BY

Meat and the environment

According to the report, following its suggestions would lead to:

lower greenhouse gas emissions and more favorable land, water and energy use than are current US dietary patterns.

It’s not alone in highlighting the impact of meat consumption on climate change. Greenhouse gas emissions from the livestock sector, mostly methane and nitrous oxide, are estimated to account for 14.5% of the global total. This is more than the direct emissions from the transport sector.

And a report from UK think tank Chatham House, released late last year, identifies reducing meat consumption as one of the strategies for limiting the extent of climate change.

But many people enjoy eating meat and don’t wish to adopt a vegetarian diet. For them, the good news is that modest meat intake is compatible with both health and environmental benefits. Having at least some main meals with less meat and more legumes, nuts, seeds and vegetables will be good not only for your health, but also for the environment.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Why shade-grown coffee is good for birds and farmers


By Evan R. Buechley, University of Utah

Your choice of coffee can make a difference for birds in tropical parts of the world — and biodiversity overall. In a study on coffee plantations in Africa, we found that coffee farms with shade trees are best for birds and that these tropical birds likely provide important environmental and economic benefits to farmers.

Earnings related to the coffee trade were an estimated US$173 billion in 2012, making coffee one of the world’s most valuable commodities. And for many tropical countries, it is the largest export. Coffee is grown on 24.8 million acres, mainly in tropical forest ecosystems – some of the most biologically rich terrestrial ecosystems on Earth.

Agriculture makes up 38% of global land cover, which means farms are critical areas for wildlife conservation. Agroforestry – a technique that combines crops with a mixture of trees and shrubs – is particularly important for biodiversity conservation. Shade coffee farming, where the crop is grown under a tree canopy, is one of the most biodiversity-friendly agricultural habitats and harbors high bird diversity.

Unfortunately, certified sustainable coffee is only about 8% of the global coffee market. The vast majority of coffee is produced in monoculture farms with few or no shade trees, which harbor minimal biodiversity and are a cause of rainforest deforestation. Additionally, intensive-sun coffee farms can face pollination and pest problems, increasing reliance on pesticides and further perpetuating ecological degradation.

Going to the source: Ethiopia

We monitor birds because they reflect the overall health of ecosystems. Since they are specialized, have key ecological functions and are susceptible to disturbances, their declines can affect ecological processes, including insect regulation, seed dispersal, and pollination.

Forest birds are declining around the world, primarily because deforestation is destroying their habitat. Currently, 14% of the world’s bird species are threatened or near threatened with extinction and most of these birds live in tropical forests.

Coffee shrubs grow beneath the tree canopy in Ethiopia – the original habitat for coffee cultivation.
Evan R. Buechley, Author provided

Coffea arabica makes up two-thirds of the world’s coffee market and is native to southwestern Ethiopia, where it has been cultivated for over 1,000 years. The agricultural industry accounts for 80% of employment in Ethiopia and coffee is the primary export crop. Here, there are different types of coffee cultivation, including near-wild coffee grown in forests, shade coffee farms with native tree canopies and monoculture sun coffee plantations. Although Ethiopia has a long history of shade coffee farming, it is following the global trend towards sun coffee monoculture.

Over a three-year period, we studied bird communities on shade coffee farms and nearby forests in southwestern Ethiopia, where C. arabica was first domesticated from wild stock. We set out to evaluate which and how many bird species were present on shade coffee farms, in comparison to nearby forests.

Lead author on the paper, Evan Buechley (left) gives a bird banding demonstration to Ethiopian school children, with the help of Girma Ayalew (right) of the Ethiopia Wildlife Conservation Authority. Outreach and capacity building through hands-on training and job creation are key aspects of our research in Ethiopia.
Evan R. Buechley, Author provided

We sampled the bird communities by using fine nets strung between bamboo poles. When birds fly into the nets, they fall gently into one of several pockets. Within 30 minutes we remove, identify, measure, tag and release them unharmed. Using this technique, we were able to evaluate species richness, diversity and bird community structure.

Our results showed that shade coffee farms had more than twice as many bird species as forest sites and all but one of nine migratory species were captured only in shade coffee habitat. Furthermore, all species that were captured in nearby forests were also captured in shade coffee sites, where we also found evidence that several threatened, forest-dwelling specialist birds were likely breeding.

Forest sites did have a much higher relative abundance of forest specialist species. Nonetheless, our study documents the only known location where all forest understory bird species recorded in nearby forest were also recorded on shade coffee farms. These results position Ethiopian shade coffee as likely the most “bird-friendly” in the world.

Ecological services

Retaining shade cover on coffee farms helps to preserve insect- and nectar-eating birds. In turn, these species provide important ecosystem services – that is, free services to humans, provided by controlling insect pests and pollinating crops.

A Tacazze sunbird is a nectar-eating bird in Ethiopian forests, one of the birds that can live and help sustain healthy habitat on shade-grown coffee farms.
Evan R. Buechley, Author provided

For example, a study in Jamaica concluded that insect-eating birds benefited coffee farmers by $125 per acre per year by controlling pests. Our results show that shade coffee farms in Ethiopia harbor a diverse and abundant insect-eating bird community that may provide similar ecosystem services.

With a per-capita GDP of $505, Ethiopia is one of the most impoverished nations on Earth. Such ecosystem services could vastly improve the livelihoods of small-scale farmers.

More importantly, certifying, publicizing and marketing Ethiopian coffee as “shade-grown” and “bird friendly” has the potential to increase the incomes of local coffee farmers, providing a disincentive to convert shade coffee farms into sun coffee plantations. Farms in Ethiopia with shade grown certification may receive as much as 20% more revenue per unit of crop.

So for your next cup, look for “shade grown” and “bird friendly” coffee certified by the Rainforest Alliance or the Smithsonian Migratory Bird Center, and encourage these organizations to certify farms in Ethiopia – the birthplace of coffee and home to what is likely the most bird-friendly coffee in the world.

The Conversation

This article was originally published on The Conversation.
Read the original article.

How Hunter-Gatherers Brought Grain to Great Britain 7,600 Years Ago


We like to imagine that our distant Ice Age ancestors were hunters by instinct, fearlessly navigating an unforgiving world in the face of an extreme climate and unimaginable danger – nomads without a home, following only the stars and the herds through the brush. Whether that sounds romanticized or barbaric, the real picture could be very different, thanks to some new evidence brought to light by British archaeologists and published in Science this week.

The diet of the average hunter-gatherer seems largely unpalatable today – they lived and died in an age without most of the produce we now take for granted (and which has also contributed to modern obesity epidemics). Yet they didn’t always live off the land in the strictest sense of the term: they actively traded with other tribes and even imported grains, which they introduced to the British Isles from the European continent. In fact, if the ancient DNA discovered just off the British coast is any indication, domesticated grain may have reached Great Britain 2,000 years earlier than researchers once thought. Beneath the rocks of the windy coast lie the remains of what was once a prehistoric hunting camp.

“The work may be forcing archaeologists to confront the challenge of fitting this into our worldview,” said Dorian Fuller, an archaeobotanist at University College London who did not partake in the research. It may show that the evolution of agriculture was much more of a gradual and complex process than archaeologists had previously thought, affecting the transition of each adapting tribe in different ways.

For decades, the classic model held by archaeologists was that some of the earliest farmers migrated from the Middle East into Europe beginning some 10,500 years ago. Upon their arrival, they either replaced or gradually converted the resident hunter-gatherer populations as they continued to move westward, finally reaching the British Isles around 6,000 years ago – around the time that Mesopotamia first discovered the properties of beer making. However, this picture has undergone a number of modifications in the last several years. Excavations of dwellings reported in 2013 showed that farmers and hunter-gatherers co-existed for a substantial period of time, during which they may have developed their own barter system for services, rather than everyone readily embracing the new concept of agriculture. A 2013 archaeological dig in Germany revealed that farmers and hunter-gatherers shared a cave for burying their dead, a practice they continued for over 800 years, indicating that these groups often lived closely together on overlapping tracts of land. Another, more controversial find, still not fully tested, is the claim that some hunter-gatherers living in the Baltic region of Europe 6,500 years ago may have eaten domesticated swine given to them by local farmers.

Rather than a neat movement from east to west, this rash of new excavations could mean that these people traveled much more erratically and extensively than we once thought. Robin Allaby, a plant geneticist at the University of Warwick in the United Kingdom, led the expedition, initially searching for evidence of the oldest domesticated plants in Great Britain, a land settled by people relatively late compared with the rest of Europe. Along the way, the researchers decided to explore an already known underwater site, Bouldnor Cliff, about 820 feet off Britain’s southern coast in the English Channel.

Bouldnor Cliff, located 36 feet below the water’s surface, was first described as fossil-rich in 1999, drawing the curiosity of researchers everywhere after the United Kingdom’s Maritime Archaeology Trust recalled “a lobster seen throwing Stone Age worked flints from its burrow.” Archaeologists haven’t left it alone since. The hunter-gatherers who camped near the site, whom we might think of as land dwellers, are suspected to have sailed wooden boats built from trees near the coast. Allaby’s team discovered some burnt hazelnut shells in the sediments, which radiocarbon dating and ancient DNA analysis dated to between 8,020 and 7,980 years ago – before rising sea levels separated Britain from France.

When analyzing the DNA samples, the team was in for another pleasant surprise. They were able to isolate two different kinds of domesticated wheat – one of which was of Middle Eastern origin, with no wild relatives in northern Europe. The nomads who camped out on Bouldnor Cliff were therefore somehow connected to the agriculture that had begun in the Middle East some 10,500 years ago.

So did they farm their own wheat somewhere near the encampment? Probably not: no traces of wheat pollen were detected in further analyses, which would have been expected had the plants actually been grown and flowered in prehistoric Britain. The team also ruled out any possibility that the sample was contaminated with modern grasses. If this is consistent with other findings, it is likely that domesticated grain began spreading to Britain from France as early as 7,600 years ago.

However, there is another possibility: hunter-gatherers from Britain may have traveled deeper into the heart of Europe than researchers have proposed, picking up products from farmers living to the east and bringing them back to their camp. Allaby acknowledges that how often, and how widely, grain was used in Britain at this time is still disputable – it may have been seen more as a rare commodity, like exotic spices, than as a staple of daily diets.

James Sullivan
James Sullivan is the assistant editor of Brain World Magazine and a contributor to Truth Is Cool and OMNI Reboot. He can usually be found on TVTropes or RationalWiki when not exploiting life and science stories for another blog article.

For safety’s sake, make food labels say what companies already know


By Melanie Voevodin, Monash University

Prime Minister Tony Abbott has called on two senior ministers to prepare a cabinet submission on country-of-origin labelling laws. The move follows a national outbreak of hepatitis A linked to frozen berries from China and Chile.

The outbreak was a strong reminder that all is not well in Australia’s food supply. Once the alleged offending ingredient was identified and relevant products recalled, consumers claimed they were not aware the berries they were choosing to eat were from China.

But labelling on the berry products complied with current labelling and consumer information laws. And despite the recall highlighting the inadequacy of the labelling, the prime minister dismissed initial calls for changes. He said it would make life very hard for business and would raise the cost of food, and that it was the responsibility of business “not to poison their customers”.

That changed this morning when Abbott asked Agriculture Minister Barnaby Joyce and Industry Minister Ian MacFarlane to submit a proposal to cabinet in March. MacFarlane has already warned that consumers may have to bear the cost of the change.

Here’s one problem with the current food-labelling system: “Made in Australia from local and imported ingredients” does not actually reveal where the food comes from. A company can claim a product is made in Australia if at least half the cost of manufacturing that product is incurred here.

Consider a jar of jam: the total cost of production includes the cost of producing the lid, the jar, the label, as well as the jam. Half the cost of production could easily be attributed to the jar itself, leaving room for jam ingredients to be imported and still allowing the label to say it was made in Australia.
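To see how the rule plays out, here is a purely illustrative calculation (the dollar figures are invented for this example, not taken from any report). Suppose the jar of jam costs $2.00 in total to produce, with $1.10 going to the locally made jar, lid and label and $0.90 to imported fruit and sugar:

\[
\frac{\$1.10}{\$2.00} = 55\% \geq 50\%,
\]

so the product clears the “at least half the cost incurred here” threshold and can be labelled “Made in Australia”, even though every food ingredient in it was imported.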

The 2011 report on the effectiveness of Australia’s food-labelling system described the challenges for improving transparency. It identified country-of-origin labelling as a particularly contentious issue, and recommended:

That for foods bearing some form of Australian claim, a consumer-friendly, food-specific country-of-origin labelling framework, based primarily on the ingoing weight of the ingredients and components (excluding water), be developed.

Other options

This recommendation was taken up by Greens leader Christine Milne. She introduced a bill to improve transparency of country-of-origin labelling just before the berry scare. It calls for three items on labels that cover where a product is grown, where it’s manufactured and where it is packaged.

Even before this, consumer organisation Choice launched a campaign about country-of-origin labelling in January 2012 after a survey showed 86% of respondents found such labels unclear.

Choice proposes a three-tiered system that specifies “product of” for primary produce such as fruit and vegetables, “manufactured in” and “packaged in”. This last one would cover foods with input from multiple companies, which makes it difficult to isolate single ingredients, and products such as mixed frozen vegetables where each vegetable is from a different country. Choice plans to develop exact wording through consumer testing.

Another simple and practical way to resolve the problem is to include the origin of imported ingredients in the “ingredient list”. Labelling laws mandate that ingredient lists appear on every product. Individual ingredients are listed in order of ingoing weight, from most to least.

Take peanut butter as an example. Its ingredients list says roasted peanuts, vegetable oil, sugar and salt. If labelling laws mandated the listing of country of origin for imported ingredients, the list might say roasted peanuts (China), vegetable oil (Chile), sugar (Philippines) and salt.

Ingredients sourced locally wouldn’t need to be declared, as made in Australia would be the default position; only imported ingredients would need to state the country of origin.

It’s about time

Milne’s bill has attracted the ire of the Australian Food and Grocery Council, the peak food, drink and grocery manufacturing body. And food manufacturers have already responded to the prime minister’s announcement. They say changing the labelling system would place an unreasonable burden on them.

But changing the wording of a label is different from adding a regime of increased testing and reporting. And although risk assessment and testing of imported foods are vital, what we now need for consumer confidence is the more cost-effective option of a label change.

Food companies track the exact point of origin of each ingredient because of quality-control procedures, supported by Australia’s food laws. All consumers want is for the companies to tell them what they already know.

Changing Australia’s country-of-origin labelling system will effectively give consumers power to make informed decisions in the free market. And it will overcome the current information asymmetry, which keeps them in the dark.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Berry scare: here’s what you need to know about hepatitis A


By Peter White

School students in Victoria, South Australia, and Queensland are the latest group thought to have eaten frozen berries linked to hepatitis A, which has now infected 14 people nationally. All food-borne disease outbreaks are frightening, but the good news for this one is that hepatitis A is rarely life-threatening.

Hepatitis is inflammation of the liver and can result from a viral infection by one of five hepatitis viruses, which are called hepatitis A, B, C, D and E viruses. The viruses are unrelated to each other but are joined in name because they all replicate in the liver.

Hepatitis A and E viruses are probably the least serious, although infection is still harmful. On average, the blood-borne hepatitis viruses – namely B, C and D – are nastier, mainly because all three can evade the immune system and cause life-long infections. This can result in continual liver damage over decades.

Hepatitis A infection

When someone is infected by the hepatitis A virus, they won’t become ill until two to seven weeks later. And they’re infectious for around two weeks before and after symptoms appear.

The infected person will usually experience general “hepatitis” symptoms (tired, feeling sick, muscle aches, fever) for up to two weeks before they start to appear yellowish or jaundiced, as a result of liver damage. Jaundice lasts for between one and three weeks.

Infection could lead to the liver being enlarged so the upper right-hand side of the body may feel tender. Even after the jaundice has faded, hepatitis A infection can leave a patient feeling tired, off their food and weak for some weeks, or even months.

The idea of all viruses is to replicate to enormous numbers so they can be transmitted to new hosts. Hepatitis is no different, and about ten billion hepatitis A viruses are excreted per gram of feces. That’s why washing your hands is so important once you’re infected. It only takes about 1,000 viruses to infect someone – you can do the maths.
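To make that arithmetic explicit, using the round figures quoted above:

\[
\frac{10^{10}\ \text{virus particles excreted per gram of feces}}{10^{3}\ \text{particles needed to infect someone}} = 10^{7}\ \text{potential infectious doses per gram.}
\]

In other words, a single gram of feces from an infected person could, in principle, carry enough virus to infect around ten million people – which is why hand washing matters so much.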

The hepatitis A virus is very common in developing countries, where human fecal waste often contaminates food and water supplies because sanitation standards are much lower. But in developed countries, such as Australia, the United States and the United Kingdom, infection with hepatitis A is quite rare; few people encounter the virus unless they travel overseas. This disparity in infection rates between wealthy and poor countries highlights the need for, and should strengthen efforts to, generate clean water supplies.

As with most viral infections, adults fare worse than children. A third of infected adults will get sick from hepatitis A, so the disease tends to be nastier in Western countries, where infections mainly occur in adults.

Only hepatitis A virus and hepatitis E virus are transmitted through the consumption of food and water contaminated with human feces. The other three are transmitted sexually (hepatitis B) or through virus-contaminated blood (hepatitis B, C and D).

Other hepatitis viruses

Infection with any of the five hepatitis viruses generally causes similar clinical symptoms, which include fever, headache, feeling weak, muscle aches, and loss of appetite, among other things. But infection with the hepatitis A virus is different from infection with the hepatitis B, C and D viruses because it seldom results in illness lasting more than a couple of months. In the vast majority of cases, the virus is eliminated from the body and the liver totally recovers.

The story is, however, different for the hepatitis B, C and D viruses, which can cause lifelong (chronic) infections in some infected people. A chronic infection is when the virus remains permanently in the liver because it cannot be removed by the immune system.

The immune system continually attacks the liver cells where the virus hides, resulting in their destruction and ultimately a reduction in the capacity of the liver to do its many tasks. This means liver damage continues over ten to 20 years and can lead to serious liver disease and liver cancer. This happens in about a quarter of those infected with these viruses.

Along with hepatitis A, hepatitis E also does not cause chronic infection. Having said that, hepatitis E is very dangerous to pregnant women and can be fatal to both mother and offspring.

Vaccine and treatment

Vaccines can protect against hepatitis A and hepatitis B infection, but they don’t cure infection. Since 1988, all Australian babies have received hepatitis B vaccinations. But as the hepatitis A virus is rare in developed countries and not life threatening, vaccination is largely offered to people travelling to places where the virus is endemic. This makes very good practical sense and is cost effective.

You should have the vaccine about a month before you travel because it takes between two and four weeks for full immune protection, which lasts about 25 years. Just like mumps and measles, once you’ve had a hepatitis A infection or the vaccine, you’ll probably be immune for life.

Our main weapon against the hepatitis A virus is immunoglobulin – antibodies that bind to the virus to prevent further infection of cells. It needs to be administered within two weeks of exposure to be effective. Other than this, there are no real antiviral treatment options for hepatitis A. People with the virus are clinically supported, with management of complications if they arise.

Treatments can suppress hepatitis B virus replication, but stopping treatment will allow the virus to come back. Only hepatitis C has a treatment that can cure it, and these drugs are very new and very expensive.

The Conversation

This article was originally published on The Conversation.
Read the original article.

How a simple vitamin B prescription could help people with Alzheimer’s


By Peter Garrard, St George's, University of London

Age strikingly increases the risk of dementia, which affects around one in a hundred people aged between 65 and 69, but one in six of those aged 80 and over. As the progression of dementia takes place over many years and is not susceptible to medical treatment, the high costs of dementia are mostly those of providing long term care. The health economic impact of dementia exceeds that of cancer, heart disease and stroke combined.

The majority of cases of late life dementia are caused by the neurodegenerative condition known as Alzheimer’s disease. Pathologists recognise two characteristic changes in the brains of people who have died with this disease: abnormal protein aggregates deposited between nerve cells (called amyloid plaques) and bundles of chemically-altered protein filaments that destroy neurons from the inside (neurofibrillary tangles).

It is assumed that drugs with actions that modify one or both of these abnormal processes will give rise to some change in the clinical course of the disease. Yet after more than a decade, the race to find the first “disease modifying agent” for Alzheimer’s has still not been won.

Disease progression

One obstacle has been the slow evolution of the disease. Like other biological organs, the brain contains more nerve cells than are needed for survival. This redundancy is advantageous, as it can allow continuation of function following damage – due to a stroke or head injury, for example. However, when the damage in question is a disease that encroaches slowly and destroys the brain one unit at a time, compensatory activity that begins to happen in parallel obscures behavioural changes in a person until the disease is so advanced that the brain is beyond repair.

There are solutions to this impasse. One of the most successful has been the introduction of the term “mild cognitive impairment” to capture a group of people whose cognitive abilities have changed in later life, but who remain independent in day-to-day activities. Among people with mild cognitive impairment will be a proportion whose symptoms truly represent the earliest stages of Alzheimer’s disease. Others will find their problems remain stable, which reflects the changing cognitive profile of normal ageing. And some others will even experience an improvement.

Brain differences.
Hey Paul Studios, CC BY

A simple and reliable (and preferably inexpensive) laboratory test to identify the subgroup destined to progress would herald a new era of disease-modifying drug discovery and large-scale trials. Unfortunately, the best of the disease biomarkers currently available are cumbersome, expensive and only partially specific. Amyloid imaging and spinal fluid analysis will correctly detect more than 90% of Alzheimer’s cases, though positive results also occur in as many as 50% of people without the condition.
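To see why that lack of specificity matters, here is a rough, purely illustrative calculation. Suppose, for the sake of argument, that one in six people tested actually has the disease (the prevalence cited above for those aged 80 and over), that the test detects 90% of true cases, and that it returns a positive result in 50% of people without the condition. The proportion of positive results that genuinely reflect Alzheimer’s would then be

\[
\frac{0.9 \times \tfrac{1}{6}}{0.9 \times \tfrac{1}{6} + 0.5 \times \tfrac{5}{6}} \approx 0.26,
\]

so under these assumed rates roughly three out of every four positive results would be false alarms.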

Prevention, because there’s no cure

An alternative strategy for reducing the burden of Alzheimer’s disease is one of prevention, based on identifying and reducing exposure to its risk factors. In its most common forms, the condition has multiple antecedents, which include both genetic predispositions and critical environmental differences, including traumatic brain injury and educational attainment (so legislating for cycle helmets and improving the quality of state-funded education may have important long-term as well as more immediate benefits to society).

There is also evidence that raised blood levels of the amino acid homocysteine constitute an important independent risk factor for dementia. Some individuals are born with high concentrations of homocysteine, which damages the linings of blood vessels, causing strokes and heart attacks in people in their twenties and thirties. Ageing brings about a gradual elevation, and studies of ageing populations have shown that higher homocysteine levels significantly increase an individual’s risk of developing dementia.

Recent studies associating raised homocysteine with higher rates of shrinkage on brain scans, and greater neurofibrillary tangle burden at post-mortem, have strengthened the evidence for a biological connection between homocysteine and Alzheimer’s pathology.

C63H88CoN14O14P – or B12 for short.
B12 by Shutterstock

Homocysteine is not taken in the diet, so this enhanced risk of dementia cannot be mitigated by lifestyle changes. It is, however, critically dependent on the status in the body of B vitamins, which promote its conversion to non-toxic and biologically useful chemicals. Low levels of vitamin B12 and folic acid thus lead to higher concentrations of homocysteine, while regular dietary supplementation effects a return to normal levels.

Prescribing vitamin B

So, might the economic burden and individual suffering associated with Alzheimer’s disease be reduced by the simple and inexpensive expedient of prescribing B-vitamins to those with high levels of homocysteine?

The VITACOG trial, a preliminary clinical trial in subjects with high plasma homocysteine levels, showed that the brains of those who received B-vitamins shrank significantly less rapidly than those of the placebo group, particularly in areas that are associated with early pathological changes in Alzheimer’s.

Such a striking result seemed to indicate the need for a nationwide trial to test whether the outcome would translate into a clinically important disease-modifying effect on the rate of progression in mild cognitive impairment.

The arguments for conducting the trial were overwhelming and, with the assistance of a national network of experts in dementia and clinical trials, I prepared the scientific and economic case for funding. Opposition to the idea, however, appeared from an unexpected quarter – a meta-analysis of cognitive outcome data taken from completed trials of B vitamins for stroke and heart attack prevention. Somehow, a statistical miscellany of recycled results was rapidly elevated to a status little short of definitive scientific proof.

Meta-analysis can be a powerful way of drawing robust conclusions from the results of an experiment that has been conducted multiple times on small populations. In the B vitamin case, the numbers included in the meta-analysis were impressive. Yet numbers mean nothing if the data is neither uniform nor directly relevant to the question. Closer scrutiny revealed that few of the trials focused on dementia, that the ages of patients who took part were well below those associated with the development of Alzheimer’s disease, and that inclusion did not require the presence of mild cognitive impairment. And astonishingly, the results from the VITACOG trial were not included.

Unsurprisingly, the outcome of the pooled analysis was anodyne: neither the treatment nor the placebo group showed any meaningful change on any measure of cognitive status during follow-up. In other words – to quote one of the members of the original VITACOG study team – the analysis merely demonstrated that: “taking B vitamins won’t prevent cognitive decline in those who overall, do not show cognitive decline anyway.”

Yet the absence of any difference between the two treatment arms has been erroneously, widely, and without qualification interpreted as evidence against the benefits of B vitamins in Alzheimer’s disease.

In recently published letters to the journal, I and many colleagues from around the world have pointed out the flaws in the meta-analysis and its harmful misinterpretation: harmful for medical research, for the cause of dementia prevention and, most of all, for the thousands of individuals who could have benefited from a safe and simple intervention.

We remain convinced that the clinical benefits of B vitamins in groups at high risk of Alzheimer’s disease should be allowed to be rigorously tested, but the damage caused by the public misinterpretation of a null study will take a while to unpick.

The Conversation

This article was originally published on The Conversation.
Read the original article.