By Patrick Holford

How often have you read, usually from “foodies” or supposed experts, the claim that you can get all the nutrients you need from a well-balanced diet, presented as if “based on science”? I found a recent example in a book by Professor Tim Spector, who dismisses the need for vitamin C or vitamin D supplementation.
I will deal with what the science and relevant studies actually show, especially regarding supplements that are claimed to help prevent cognitive decline. But first, let’s look more deeply at the mindset behind such claims.
The idea that we can get all the nutrients we need from food makes intuitive sense. Underneath this lies the belief that we evolved to grow and survive using the nutrients available in food. Darwin reached a similar conclusion when he argued that “the conditions of existence” were the main driving force in evolution.
Extending this logic, consider the period of prehistory when hominid brain size grew steadily, culminating in Homo sapiens brain size (calculated from skull size) of almost 1,700 grams, circa 20,000 to 30,000 years ago. It has since shrunk by about 20% to today’s average brain size of less than 1,350 grams. So it is equally logical to ask: what has changed in our “conditions of existence” to result in our shrinking brains? If what we were eating over 20,000 years ago was closer to optimal, and what we eat now is often pathological, what are the main differences?
The first difference is the quantity of food. Today the average person expends roughly 200 to 400 calories a day on physical activity; our ancestors expended around 600 to 1,200 calories a day, at least three times as much. To maintain weight, they therefore had to eat substantially more food than we do. So even if we ate the same foods with the same nutrient density, we would still be more likely to fall short. And of course today’s food is often less nutrient-dense as well.
A simpler, far less “prehistoric” illustration comes from the diet of mid-Victorian workers. A study in the Journal of the Royal Society of Medicine found that mid-Victorian workers, while not necessarily eating an “optimal” diet, had a far higher intake of vitamins, minerals and essential fatty acids than we do today. The authors concluded that this “constitutes a persuasive argument for a more widespread use of food fortification and/or food supplements” to make up the difference. (1)
So which nutrients that were abundant in our ancestors’ diets are widely missing now? The starting point has to be marine food. Early humans had to migrate and live along the water’s edge for basic survival. Rivers, estuaries, swamplands and coasts would have provided a plentiful supply of marine foods, rich in both omega-3 and phospholipids such as choline, plus vitamin D, vitamin B12, selenium, zinc and iodine: all completely essential for brain development. Nutrient-dense foods such as molluscs, crustaceans and small fish caught in rock pools were also highly accessible. For gatherers, most likely women, these “fruits de mer” were rich pickings, and since brain development happens largely during pregnancy, maternal nutrition was especially crucial.
There is also the “aquatic” or “waterside ape” hypothesis promoted by brain researcher Professor Michael Crawford and discussed widely in popular science. Support for it includes an analysis of the diet of a 40,000-year-old Homo sapiens discovered in a cave on the coast of South Wales: based on bone analysis, at least 20% of this individual’s diet is estimated to have been marine food. Given our ancestors’ much higher level of physical activity, roughly half of a modern diet would need to consist of marine food to achieve a nutrient intake equivalent to that of the period when the Homo sapiens brain reached its largest size.
In short, it is hard to explain human brain evolution without abundant omega-3, choline and vitamin B12, alongside a high intake of folate from plants (previously all organic).
Omega-3 DHA, found primarily in marine foods, is a keystone nutrient for brain structure and function, yet typical modern intake is very low. The optimal intake for brain health and dementia prevention may be as much as 1,000 mg a day, and observational studies suggest the lowest risk of several diseases occurs at around 2,000 mg of combined EPA and DHA. This kind of intake is entirely consistent with an ancestral diet rich in marine food, but it is hard to achieve today without eating marine foods almost every day, which would also help provide enough choline.
The average choline intake for women is just 278 mg a day, and it is even lower in vegans and those who don’t eat fish, which can be expensive for people on lower incomes. To achieve an optimal intake from food alone, one would need to eat several eggs, or frequent servings of fish or other marine foods, most days of the week. This is unrealistic for many people.
Let’s test this from a different angle, using studies examining intakes of these nutrients and their effects on brain health, including risk of cognitive decline.
Take choline, which is richest in marine food, eggs and organ meats. A study of 125,000 people followed over 12 years, using UK Biobank data and published in the American Journal of Clinical Nutrition, found a relationship between higher dietary intake of choline and reduced dementia risk, with greatest benefits around 400 mg a day. Risk for Alzheimer’s was also lowest around this level of intake. (2)
Then there is vitamin B12, found only in foods of animal origin, especially marine foods and eggs. When sufficiency (and its counterpart, deficiency) is correctly defined in terms of optimal health and minimal disease risk, including dementia risk, it becomes clear that many older people need up to 500 mcg of B12 to raise serum B12 above 500 pg/ml and keep homocysteine below 10 µmol/L. Roughly half of those over 60 fail to meet these thresholds.
Accelerated brain shrinkage occurs below 500 pg/ml, as established by Professor David Smith’s research at Oxford University more than a decade ago. This is why several countries, such as Japan, set the “normal” range for serum B12 as above 500 pg/ml. Despite clear evidence over the past decade, both UK and US health authorities have failed to correct the reference range for vitamin B12, which is set at less than half this, namely around 180 pg/ml. (3)
A recent study of 3,000 EU children reported that the median B12 level was 347 pg/ml and one third were below 200 pg/ml. (4) This means that many children are already in the risk zone, and deficiency is more prevalent in vegan children. In older adults the problem is compounded by poor absorption, made worse by antacids, since stomach secretions are required to absorb vitamin B12. Hence, those taking PPI antacids such as omeprazole for more than 4.4 years have a 30% increased risk of dementia. (5) There is no realistic way for many older adults to achieve these required intakes of B12 from diet alone.
How do you know what you need? I recommend testing homocysteine, which tells you whether you are in the brain-shrinkage risk zone above 10 µmol/L, or testing serum B12 to ensure it is above 500 pg/ml, then supplementing accordingly with vitamin B12 or, better still, a homocysteine-lowering formula if levels fall outside the ideal range.
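The two cut-offs above can be summarised in a simple checker. This is an illustrative sketch only, using the thresholds quoted in this article; the function name and messages are my own, and none of this replaces clinical interpretation of a blood test:

```python
def b12_brain_check(homocysteine_umol_l=None, serum_b12_pg_ml=None):
    """Flag results falling in the risk zones quoted in this article:
    homocysteine above 10 umol/L, or serum B12 below 500 pg/ml."""
    flags = []
    if homocysteine_umol_l is not None and homocysteine_umol_l > 10:
        flags.append("homocysteine above 10 umol/L: brain-shrinkage risk zone")
    if serum_b12_pg_ml is not None and serum_b12_pg_ml < 500:
        flags.append("serum B12 below 500 pg/ml: consider B12 or a "
                     "homocysteine-lowering formula")
    return flags

# Hypothetical results for an older adult
print(b12_brain_check(homocysteine_umol_l=12.4, serum_b12_pg_ml=310))
```

Both thresholds are checked independently, since the article treats either marker alone as grounds for supplementing.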
Vitamin D, also found in marine foods, cannot be obtained in sufficient amounts from diet alone for around a third of the year, when there is too little sunlight acting on the skin to synthesise it. A recent scientific report states: “Vitamin D3 plays a pivotal role not only in bone health but also in the functioning of the nervous system, particularly in the context of age-related neurodegenerative diseases such as Alzheimer’s disease, multiple sclerosis, and Parkinson’s disease.” (6)
Cognitive decline is far more likely if vitamin D is low, and Alzheimer’s risk is lower when levels are higher. In one large study, those who supplemented vitamin D had a lower incidence of dementia. (7) Anyone who implies you can still get enough from food in winter is less enlightened than the UK Government, which recommends that everyone supplement vitamin D from October to March.
Vitamin D’s protective effect depends on your blood level. I supplement 800 IU a day in the summer months and 3,000 IU a day in the winter months, but the real guide is whatever dose keeps my blood vitamin D level above 75 nmol/L (30 ng/ml). Bear in mind that I also eat oily fish and make a point of getting at least 20 minutes of sun exposure a day in summer, plus a winter holiday in the sun. Factors such as darker skin, excess weight, low fish intake or low sun exposure increase your needs. That is why we recommend testing vitamin D, ideally as winter approaches and again towards the end of winter, so you know your supplementation is on track.
A recent Chinese study published earlier this year on women over 100 found those in the highest quarter for vitamin D had an 87% reduced risk of dementia compared to those in the lowest quarter. (8) Risk was lowest in those with a blood level above 73.5 nmol/L (29.3 ng/ml). This is remarkably consistent with levels associated with benefits for bone health, immunity and many other outcomes. It is also the level our scientists set to achieve “green” on the DRIfT test. The vast majority of people in the western world do not reach this level.
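Vitamin D results are reported in two different units, as the figures above show: nmol/L (common in UK labs) and ng/ml (common in US labs). The standard conversion factor for 25-hydroxyvitamin D is 2.496 nmol/L per ng/ml, often rounded to 2.5. A minimal sketch of the conversion, so you can compare your own result against the targets quoted here:

```python
# Standard conversion factor for 25-hydroxyvitamin D: 1 ng/ml = 2.496 nmol/L
NG_ML_TO_NMOL_L = 2.496

def ng_ml_to_nmol_l(ng_ml):
    return ng_ml * NG_ML_TO_NMOL_L

def nmol_l_to_ng_ml(nmol_l):
    return nmol_l / NG_ML_TO_NMOL_L

# The 75 nmol/L target quoted above corresponds to about 30 ng/ml
print(round(nmol_l_to_ng_ml(75), 1))  # → 30.0
```

This is why 75 nmol/L and 30 ng/ml, or 73.5 nmol/L and 29.3 ng/ml, appear side by side in the text: they are the same level in two unit systems.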
Vitamin C, by contrast, is produced by almost all animals, but not by humans. Guinea pigs also cannot make it, which is why they became an experimental animal of choice: they share our vulnerability. We depend on vitamin C for many functions in the body, and thousands of studies support its roles, including as a key antioxidant, in collagen synthesis, and as a vital nutrient for immunity.
Yet I still read so-called experts who claim vitamin C “does nothing” for colds, cancer or anything else. Are they simply not reading the science? In very high intravenous doses, it has been used in some hospital settings, including trials in critical illness. Additionally, there are examples of trials in cancer therapy, such as a randomised trial of pharmacological ascorbate alongside chemotherapy which reported improved outcomes in metastatic pancreatic cancer. (9)
If you look at recent reviews, you’ll see statements like these:
“Mounting evidence indicates that vitamin C has the potential to be a potent anti-cancer agent when administered intravenously and in high doses.” (10)
“Supplementation with vitamin C appears to be able to both prevent and treat respiratory and systemic infections… treatment of established infections requires significantly higher (gram) doses.” (11)
Yet we still see claims that vitamin C does nothing for colds. One survey asked people if they took vitamin C and found no difference in incidence of COVID between those who did and did not supplement. But why would it? Vitamin C does not necessarily prevent infection. What it does, especially in higher doses upon infection, is support immune function and may reduce symptom severity and duration.
Even the critical comments in the media often refer to the meta-analysis of cold studies by Professor of Public Health Dr Harri Hemilä in Finland. What he actually concludes is that vitamin C shortens cold duration, with dose-response effects reported in some controlled trials, and evidence of reduced pneumonia risk in certain contexts. (12)
I take 2 grams a day. This is consistent with what many primates achieve in the wild. We struggle to get 100 mg from food alone, a fraction of what our biology seems designed to handle.
Vitamin C also appears relevant to dementia risk. In observational research, vitamin C (especially alongside vitamin E) has been associated with lower risk of cognitive decline and Alzheimer’s, and evidence reviews have included vitamin C and E among “grade 1” prevention factors in large-scale evidence mapping. (13,14)
In conclusion, Professor David Smith, former Deputy Head of the Faculty of Medical Science at Oxford University, and I drafted this statement on supplements for Food for the Brain:
‘The conventional view regarding nutritional supplements is that they are largely unnecessary if a person eats a “well-balanced diet”. This is based on recommended intakes (RDAs, RNIs) designed to prevent classical symptoms of deficiency, such as scurvy in the case of vitamin C. Blood levels of nutrients that prevent classical deficiencies are then extended to imply that a person has sufficient nutrient status if they are above these levels.
But there is abundant evidence that levels above those used to define “deficiency” may be associated with better outcomes, and these levels define a zone of “nutritional insufficiency”. There is also a growing body of evidence from well-designed studies on specific diseases showing that supplements providing nutrients beyond basic RDAs can delay, reduce or ameliorate symptoms, and that risk often reduces steadily as nutrient status rises beyond arbitrary cut-offs.
This illustrates that the definition of “deficiency” is outdated. Deficiency means a lack of efficiency. If deficiency, and its counterpart sufficiency, were instead defined as the level of a nutrient that reduces symptoms or lowers disease risk, the definition would be scientifically supportable and would take into account biochemical individuality, including genetics, environment, the microbiome and an individual’s ability to absorb nutrients.’
At Food for the Brain, our overarching principle is scientific integrity: consistency with the prevailing science. We share that growing body of knowledge in a way that enables people to restore, maintain, and improve mental health.
If you want ongoing support, the simplest place to start is by becoming a FRIEND of Food for the Brain. For £50 a year you get access to our full programme of free education webinars, monthly group coaching and the COGNITION programme, which is designed to help you turn evidence into practical, lasting habits. It is guidance you can trust, rooted in science, and delivered with the support of a community working towards the same goal: better brain health for all.
One of the most powerful free resources we offer is the Cognitive Function Test. The test is a validated, research-backed way to check how your brain is functioning right now, across key cognitive domains.
It provides a meaningful baseline. It also helps track change over time, and can highlight where nutrition and lifestyle support may be most needed. If you want to understand your brain health before symptoms appear, this is the place to begin.
Many people take supplements with good intentions but no real clarity about whether they are helping. This is where DRIfT comes in.
DRIfT allows you to measure key brain-related biomarkers. You can see whether what you are taking is actually working, and where your priorities should lie. Instead of guessing, you can focus on what your brain genuinely needs, based on objective data.
Food for the Brain is a not-for-profit educational and research charity. It offers a free Cognitive Function Test and assesses your Dementia Risk Index, which allows us to advise you on how to dementia-proof your diet and lifestyle.
By completing the Cognitive Function Test you are joining our grassroots research initiative to find out what really works for preventing cognitive decline. We share our ongoing research results with you to help you make brain-friendly choices.
Please support our research by becoming a Friend of Food for the Brain.