
An interesting study showed that when gut microbes are deprived of dietary fiber (their food), they start to eat the natural layer of mucus that lines the colon. (The colon is part of the large intestine.) This matters because the colon's mucus layer normally acts as a barrier to pathogenic microbes. Yes, the study was done in mice, but the researchers believe it accurately models what also happens in humans. Their conclusion: when the microbes in the gut don't get enough dietary fiber from plants (such as whole grains, fruits, vegetables, seeds, nuts), the microbes feed on the colon's mucus layer, which results in inflammation and makes the colon more vulnerable to pathogenic (disease-causing) microbes. This is what some people refer to as "leaky gut".

Research shows that changes in the diet (high fiber vs. low fiber) quickly result in changes in the gut microbes of humans and rodents - so it's important to consistently eat a variety of plant fiber. Currently the recommended daily fiber intake for adults is 28 to 35 grams (chart of some high fiber foods). The researchers found that some bacterial strains flourished best in low- or no-fiber conditions, and it was these bacteria that were involved in breaking down the mucus layer. The research also showed that so-called "prebiotics" (purified forms of soluble fiber similar to what some processed foods and supplements contain) also resulted in thinning of the colon's mucus layer - they did not properly feed the gut microbes. From Medical Xpress:

High-fiber diet keeps gut microbes from eating colon's lining, protects against infection

It sounds like the plot of a 1950s science fiction movie: normal, helpful bacteria that begin to eat their host from within, because they don't get what they want. But new research shows that's exactly what happens when microbes inside the digestive system don't get the natural fiber that they rely on for food. Starved, they begin to munch on the natural layer of mucus that lines the gut, eroding it to the point where dangerous invading bacteria can infect the colon wall. In a new paper in Cell, an international team of researchers show the impact of fiber deprivation on the guts of specially raised mice. The mice were born and raised with no gut microbes of their own, then received a transplant of 14 bacteria that normally grow in the human gut. 

The findings have implications for understanding not only the role of fiber in a normal diet, but also the potential of using fiber to counter the effects of digestive tract disorders. "The lesson we're learning from studying the interaction of fiber, gut microbes and the intestinal barrier system is that if you don't feed them, they can eat you," says Eric Martens, Ph.D., an associate professor of microbiology at the University of Michigan Medical School....Using U-M's special gnotobiotic, or germ-free, mouse facility, and advanced genetic techniques that allowed them to determine which bacteria were present and active under different conditions, they studied the impact of diets with different fiber content - and those with no fiber. They also infected some of the mice with a bacterial strain that does to mice what certain strains of Escherichia coli can do to humans - cause gut infections that lead to irritation, inflammation, diarrhea and more.

The result: the mucus layer stayed thick, and the infection didn't take full hold, in mice that received a diet that was about 15 percent fiber from minimally processed grains and plants. But when the researchers substituted a diet with no fiber in it, even for a few days, some of the microbes in their guts began to munch on the mucus. They also tried a diet that was rich in prebiotic fiber - purified forms of soluble fiber similar to what some processed foods and supplements currently contain. This diet resulted in the same erosion of the mucus layer as the fiber-free diet.

The researchers also saw that the mix of bacteria changed depending on what the mice were being fed, even day by day. Some species of bacteria in the transplanted microbiome were more common - meaning they had reproduced more - in low-fiber conditions, others in high-fiber conditions. And the four bacteria strains that flourished most in low-fiber and no-fiber conditions were the only ones that make enzymes that are capable of breaking down the long molecules called glycoproteins that make up the mucus layer....  Just like the mix of bacteria, the mix of enzymes changed depending on what the mice were being fed, with even occasional fiber deprivation leading to more production of mucus-degrading enzymes.

Images of the mucus layer, and the "goblet" cells of the colon wall that produce the mucus constantly, showed the layer was thinner the less fiber the mice received. While mucus is constantly being produced and degraded in a normal gut, the change in bacteria activity under the lowest-fiber conditions meant that the pace of eating was faster than the pace of production - almost like an overzealous harvesting of trees outpacing the planting of new ones. 

When the researchers infected the mice with Citrobacter rodentium - the E. coli-like bacteria - they observed that these dangerous bacteria flourished more in the guts of mice fed a fiber-free diet. Many of those mice began to show signs of illness and lost weight. When the scientists looked at samples of their gut tissue, they saw not only a much thinner or even patchy mucus layer - they also saw inflammation across a wide area. Mice that had received a fiber-rich diet before being infected also had some inflammation, but across a much smaller area. [Original study]

A thick mucus layer (green), generated by the cells of the colon's wall, provides protection against invading bacteria and other pathogens. This image of a mouse's colon shows the mucus (green) acting as a barrier for the "goblet" cells (blue) that produce it. Credit: University of Michigan

This study found impressive results - improvement in autistic behaviors in children diagnosed with autism spectrum disorder (ASD) after four months of daily vitamin D supplementation. Children in the placebo group did not show improvement. A nice aspect of the study was that the children were randomly assigned to a placebo or a vitamin D group (so that the groups were not self-selected) and it was double-blinded (so no one knew who was getting the vitamins - again, to prevent bias). This was a preliminary study - meaning more studies are needed - but it would be amazing if these results hold up... From Science Daily:

Vitamin D supplements may benefit children with autism spectrum disorder

Studies have shown an association between the risk of autism spectrum disorder and vitamin D insufficiency. In this latest study, 109 children with autism spectrum disorder were randomized to receive four months of vitamin D3 supplementation or a placebo. "Autism symptoms -- such as hyperactivity, social withdrawal, and others -- improved significantly following vitamin D3 supplementation but not after receiving placebo," said Dr. Khaled Saad, lead author of the Journal of Child Psychology and Psychiatry study.

Excerpts from the original study from The Journal of Child Psychology and Psychiatry: Randomized controlled trial of vitamin D supplementation in children with autism spectrum disorder

Autism spectrum disorder (ASD) is a frequent developmental disorder characterized by pervasive deficits in social interaction, impairment in verbal and nonverbal communication, and stereotyped patterns of interests and activities. It has been previously reported that there is vitamin D deficiency in autistic children; however, there is a lack of randomized controlled trials of vitamin D supplementation in ASD children.

This study is a double-blinded, randomized clinical trial (RCT) that was conducted on 109 children with ASD (85 boys and 24 girls; aged 3–10 years). The aim of this study was to assess the effects of vitamin D supplementation on the core symptoms of autism in children. ASD patients were randomized to receive vitamin D3 or placebo for 4 months. The serum levels of 25-hydroxycholecalciferol (25 (OH)D) were measured at the beginning and at the end of the study. The autism severity and social maturity of the children were assessed by the Childhood Autism Rating Scale (CARS), .... 
Supplementation of vitamin D was well tolerated by the ASD children. The daily dose used in the therapy group was 300 IU vitamin D3/kg/day, not to exceed 5,000 IU/day. The autism symptoms of the children improved significantly following 4-month vitamin D3 supplementation, but not in the placebo group. This study demonstrates the efficacy and tolerability of high doses of vitamin D3 in children with ASD.
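The dosing rule quoted above (300 IU of vitamin D3 per kg of body weight per day, capped at 5,000 IU/day) is a simple calculation. The sketch below is purely illustrative - the function name and example weights are made up, and this is not dosing guidance:

```python
def vitamin_d3_daily_dose_iu(weight_kg: float) -> int:
    """Daily vitamin D3 dose per the trial's therapy protocol:
    300 IU per kg of body weight, not to exceed 5,000 IU/day."""
    return min(round(300 * weight_kg), 5000)

# Illustrative weights: a 12 kg child gets 3,600 IU/day;
# a 20 kg child would hit the 5,000 IU/day cap (300 * 20 = 6,000).
print(vitamin_d3_daily_dose_iu(12))  # 3600
print(vitamin_d3_daily_dose_iu(20))  # 5000
```

Note that the cap matters fairly early: any child over about 16.7 kg (roughly 37 pounds) receives the maximum 5,000 IU/day under this protocol.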

Recently, Wang et al. (2016) performed a systematic review and meta-analysis of all studies on serum concentration of 25 (OH)D in ASD (Wang et al., 2016). Eleven studies were included, accounting for a total of 870 ASD patients and 782 healthy controls. Serum levels of 25 (OH)D in participants with ASD were significantly lower than those in controls. They concluded that low vitamin D might serve as a risk factor for autism spectrum disorder (Wang et al., 2016). 

In a recent survey, our research group measured 25 (OH)D in 122 ASD children (3–9 years old) and 100 healthy children as controls (Saad, Abdel-Rahman, et al., 2015). The ASD group showed a significantly lower level of serum 25 (OH)D compared with the control group (p < .0001). The study found highly significant inverse correlations between serum 25 (OH)D levels and autism rating scales. In the second part of the previous study (Saad, Abdel-Rahman, et al., 2015), an open-label trial of 83 subjects who completed a 3-month therapy with high daily doses of vitamin D (300 IU/kg/day) was performed. Collectively, 80.7% of the children with ASD had significantly improved outcome, mainly in the sections of the CARS and ABC subscales that measure behavior, stereotypy, eye contact, and attention span (Saad, Abdel-Rahman, et al., 2015).

Guidelines for how to prevent food allergies in children are changing. Until very recently, the advice was to avoid, avoid, avoid exposing babies or young children to any potential allergens. Remember parents being advised that if an allergy to X (whether pets or food) runs in the family, then absolutely avoid exposing the child to the potential allergen? Well, recent research (here, here, and here) found that the opposite is true - that in the first year of life the baby should be exposed to potential allergens (whether animals or food), which stimulates the child's developing immune system in beneficial ways.

Physicians at a recent conference of allergists said that evidence shows that allergenic foods — including peanuts, eggs, and milk — should be introduced in the first year of life. The new 2017 medical guidelines will recommend introducing small amounts of peanuts (mixed in with other foods) when children are 4 to 6 months of age.

About two years ago a landmark study (LEAP study) found that when infants at a high risk of developing peanut allergy consumed peanuts on a regular basis, their risk of peanut allergy was dramatically reduced. And the opposite was also true: peanut avoidance in the first year of life was associated with a greater frequency of peanut allergy. Which made doctors start to rethink their strategies of how to avoid food allergies. From Medscape:

Allergenic Foods Should Be Introduced to Infants Early

Although the evidence shows that allergenic foods — including peanuts, eggs, and milk — should be introduced in the first year of life, guidelines are lagging behind, said an allergist speaking here at the American College of Allergy, Asthma & Immunology (ACAAI) 2016 Annual Scientific Meeting. Official guidelines to be issued early in 2017 will address only peanuts, recommending introduction when children are 4 to 6 months of age.

"There is now a large body of observation and trial data for other foods, including egg, that show that delaying the introduction of allergenic solids increases the risk of those particular food allergies," said Katrina Allen, MBBS, PhD, from the Murdoch Childrens Research Institute in Melbourne, Australia. Policy changes are needed to help guide parents' decisions, she said. In fact, there is evidence showing that changes to policy — namely, infant-feeding guidelines — mirror the rise in the incidence of food allergies.

Not everyone agrees on exposure amount and timing in the case of egg allergy. In a recent trial, researchers looked at the early introduction of allergenic foods in breast-fed children (N Engl J Med. 2016;374:1733-1743). The prevalence of any food allergy was significantly lower in the early-introduction group than in the standard-introduction group, as was the prevalence of peanut allergy and egg allergy. And a study Dr Allen was involved in, which introduced cooked egg in small amounts, showed that early introduction reduced allergy (J Allergy Clin Immunol. 2010;126:807-813).

However, in a German study, where greater amounts of egg were introduced at 4 to 6 months, early exposure increased the risk for life-threatening allergic reactions (J Allergy Clin Immunol. Published online August 12, 2016). And in the STEP study, there was no change in the number of food allergies in 1-year-old children when egg was introduced early (J Allergy Clin Immunol. Published online August 20, 2016). However, that did not take into account high-risk infants, particularly those with eczema, who are known to have a higher incidence of egg allergy and are likely to see a much greater benefit from the early introduction of egg.

The new peanut guidelines — coauthored by Amal Assa'ad, MD, from the Cincinnati Children's Hospital, who is chair of the ACAAI food allergy committee — will recommend that children with no eczema or egg allergy can be introduced to peanut-containing foods at home, according to the family's preference. And for children with mild to moderate eczema who have already started solid foods, the guidelines say that peanut-containing foods can be introduced at home at around 6 months of age, without the need for an evaluation. However, the guidelines caution, peanut-containing foods should not be the first solid food an infant tries, and an introduction should be made only when the child is healthy. The first feeding should not happen when the child has a cold, is vomiting, or has diarrhea or another illness.

For eggs, there is no official recommendation as of yet....The early introduction of allergenic foods is not the only policy that needs to be changed to lower the incidence of food allergies, Dr Allen told Medscape Medical News. Other factors, particularly environmental factors — mostly written up in observational studies — are contributing to an increasing intolerance to allergenic foods. Policies advocating that kids "get down and dirty," have more exposure to dogs, and bathe less are also warranted....Dr Allen and Dr Assa'ad agree that delaying the introduction of foods such as cow's milk and egg until after 12 months is harmful. Guidelines should encourage families to introduce these foods in the first year of life, once solids have commenced at around 6 months, but not before 4 months.

An article points out something we should all be concerned with, but that is being ignored - pesticide residues of glyphosate (found in Monsanto's Roundup) and 2,4-D in foods. Glyphosate is the most used pesticide in the world, and both glyphosate and 2,4-D pesticide residues in food are set to really increase with the introduction of genetically modified crops resistant to both pesticides. (Farmers can repeatedly spray the pesticides against weeds and the crops won't die - they're resistant - so the crops and foods will contain pesticide residues.) Both pesticides are linked to a variety of health problems.

The FDA, under pressure, started analyzing some foods this year for glyphosate residues, but has already stopped due to "problems". When and if the testing will resume is unknown. It is important to know that pesticide residues at varying levels have been found in many foods (here, here, and here). What this chronic low-level exposure does to humans is unknown. The only way to avoid or minimize pesticide exposures is to eat organic foods (here, here, and here).

When reading the following excerpts, note that there are no standards for glyphosate residues in honey in the United States. Research by an FDA chemist and a chemist at the University of Iowa found glyphosate residues at varying levels, with a high of 653 parts per billion (ppb) in one sample (honey from Iowa) - more than 10 times the limit of 50 ppb allowed in the European Union. Other honey samples tested had glyphosate residues ranging from under 16 ppb to over 123 ppb (honey from Louisiana). I mention this because pesticide residues are an important issue - we don't know what chronic exposure to mixtures of low levels of pesticides in foods does to us. To babies and children, to pregnant women, to the elderly, to all of us. The following article excerpts are by journalist Carey Gillam, from The Huffington Post:

FDA Suspends Testing for Glyphosate Residues in Food

Government testing for residues of an herbicide that has been linked to cancer has been put on hold, slowing the Food and Drug Administration’s first-ever endeavor to get a handle on just how much of the controversial chemical is making its way into U.S. foods. The FDA, the nation’s chief food safety regulator, launched what it calls a “special assignment” earlier this year to analyze certain foods for residues of the weed killer called glyphosate after the agency was criticized by the U.S. Government Accountability Office for failing to include glyphosate in annual testing programs that look for many less-used pesticides. Glyphosate is the most widely used herbicide in the world, and is the key ingredient in Monsanto Co.’s branded Roundup herbicide line.

Glyphosate is under particular scrutiny now after the World Health Organization’s cancer experts last year declared the chemical a probable human carcinogen. Several private groups and nonprofits have been doing their own testing, and have been finding glyphosate residues in varying levels in a range of foods, raising consumer concerns about the pesticide’s presence in the American diet.

The FDA’s residue testing for glyphosate was combined with a broader herbicides analysis program the FDA set in motion in February of this year. But the glyphosate testing has been particularly challenging for the FDA. The agency was finally forced to put the glyphosate residue testing part of the work plan on hold amid confusion, disagreement and difficulties ....FDA spokeswoman Megan McSeveney confirmed the testing suspension and said the agency is not sure when it will resume.....Alongside the testing for glyphosate, the FDA laboratories have also been analyzing foods for 2,4-D and other “acid herbicides,” documents obtained from the FDA show. The category of acid herbicides includes five of the top 10 active ingredients used in homes and gardens. Usage of 2,4-D is expected to triple in the coming year, according to the FDA.

McSeveney said glyphosate residues were only being analyzed in soy, corn, milk and eggs and the popcorn samples, while the other foods are being tested for residues of other herbicides. Earlier this year, one of the agency’s senior chemists also analyzed glyphosate residues in honey and oatmeal and reported his results to the agency. Some honey samples contained residue levels well over the limit allowed in the European Union. The United States has no legal tolerance for glyphosate in honey, though the Environmental Protection Agency (EPA) said recently it may set one because of the FDA findings. 

With the testing on hold, it is not clear when the agency might have final results on the glyphosate residue analysis. McSeveney said preliminary results showed no violations of legal tolerance levels allowed for glyphosate in the foods tested. She did not provide details on what, if any, levels of residue were found. Tolerance levels are set by the EPA for a variety of pesticides expected to be found in foods. When residue levels are detected above the tolerance levels, enforcement action can be taken against the food producers.

Though FDA annually tests domestic and imported foods for residues of other pesticides, it never tested for glyphosate before. It has not routinely tested for 2,4-D either, a fact also criticized by the GAO. The FDA testing for 2,4-D residues comes as the use of 2,4-D with food crops is expected to start rising due to the commercialization of new formulated herbicide products that combine glyphosate and 2,4-D. Safety questions have been raised about the combination. But the EPA gave a green light on Nov. 1 to a Dow AgroSciences’ herbicide combination of glyphosate and 2,4-D. The new products are intended to counter widespread weed resistance to glyphosate, and be used with new types of genetically engineered herbicide-tolerant crops.
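The honey figures mentioned earlier (a high of 653 ppb against the EU's 50 ppb limit) are easy to check programmatically. A minimal sketch - the sample values come from the article, but the dictionary keys and helper name are my own:

```python
EU_GLYPHOSATE_HONEY_LIMIT_PPB = 50  # EU limit cited in the article

def exceeds_eu_limit(residue_ppb: float) -> bool:
    """True if a honey sample's glyphosate residue exceeds the EU limit."""
    return residue_ppb > EU_GLYPHOSATE_HONEY_LIMIT_PPB

# Residue levels reported in the article (parts per billion)
samples = {"Iowa": 653, "Louisiana": 123, "lowest sample": 16}
for origin, ppb in samples.items():
    status = "over EU limit" if exceeds_eu_limit(ppb) else "within EU limit"
    print(origin, ppb, status)

# The Iowa sample is 653 / 50, i.e. about 13 times the EU limit.
```

Remember that, as the article notes, the United States has no legal tolerance for glyphosate in honey at all, so there is no domestic limit to compare against.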

Vitamin D deficiency has been implicated in a variety of cancers (here, here, and here). Now a study has found that vitamin D levels are linked to long-term outcomes in women with breast cancer. Researchers found that after 7 years, women with the highest levels of vitamin D had about a 30 percent better likelihood of survival from breast cancer than women with the lowest levels. The study put women into one of three groups based on their levels of vitamin D (as measured in the blood 2 months after the initial breast cancer diagnosis): deficient - levels below 20.0 ng/mL; insufficient - 20.0 to 29.9 ng/mL; and sufficient - greater than or equal to 30.0 ng/mL. They found that almost half of the women were vitamin D deficient, and another third were insufficient.
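The study's three cut points translate directly into a classification rule. A minimal sketch, with the function name my own invention:

```python
def vitamin_d_status(level_ng_ml: float) -> str:
    """Classify serum 25(OH)D by the study's cut points (ng/mL):
    deficient < 20.0, insufficient 20.0-29.9, sufficient >= 30.0."""
    if level_ng_ml < 20.0:
        return "deficient"
    if level_ng_ml < 30.0:
        return "insufficient"
    return "sufficient"

print(vitamin_d_status(12.5))  # deficient
print(vitamin_d_status(24.0))  # insufficient
print(vitamin_d_status(30.0))  # sufficient
```

Note that the boundaries are half-open: 29.9 ng/mL is still insufficient, while exactly 30.0 ng/mL counts as sufficient.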

The researchers said the findings "provide compelling observational evidence for inverse associations between vitamin D levels and risk of breast cancer progression and death". In other words, the higher the vitamin D levels, the better the outcome. NOTE: Good vitamin D levels can usually be obtained with one 1000 IU supplement of vitamin D3 per day. Or expose bare skin to sunlight - after all, it is called the "sunshine vitamin". From Medical Xpress:

Higher vitamin D levels associated with better outcomes in breast cancer survivors

Women with higher vitamin D levels in their blood following a breast cancer diagnosis had significantly better long-term outcomes, according to new research from Kaiser Permanente and Roswell Park Cancer Institute....Vitamin D is a nutrient best known for its role in maintaining healthy bones; conversely, vitamin D deficiency has been associated with the risk for several cancers.

Common sources of vitamin D include sun exposure, fatty fish oils, vitamin supplements, and fortified milks and cereals. While the mechanisms for how vitamin D influences breast cancer outcomes are not well understood, researchers believe it may be related to its role in promoting normal mammary-cell development, and inhibiting the reproduction of and promoting the death of cancer cells.

"We found that women with the highest levels of vitamin D levels had about a 30 percent better likelihood of survival than women with the lowest levels of vitamin D," said Lawrence H. Kushi, ScD....principal investigator of Kaiser Permanente's Pathways study of breast cancer survivorship. The current study included 1,666 Pathways study members who provided samples between 2006 and 2013....had a diagnosis of invasive breast cancer in 2006. Participants provided blood samples within two months of diagnosis and answered questions about diet, lifestyle and other risk factors, with follow-ups at six months and at two, four, six and eight years.

"With the extremely rich data sources from a large sample size, we were able to prospectively analyze three major breast cancer outcomes—recurrence, second primary cancer and death," said Song Yao, PhD, associate professor of oncology at Roswell Park Cancer Institute and the study's lead author....In addition to lower overall mortality among all breast cancer survivors studied, the researchers found even stronger associations among premenopausal women in the highest third of vitamin D levels for breast-cancer-specific (63 percent better), recurrence-free (48 percent better) and invasive-disease-free survival (42 percent better), during a median follow up of seven years. [Original study]

An earlier post discussed how emulsifiers (which are added to most processed foods to aid texture and extend shelf life) can alter the community of microbes that live in our gut (the gut microbiota) in such a way as to cause intestinal inflammation. Now the same researchers have found that regular consumption of emulsifiers alters intestinal bacteria in a manner that promotes low-grade intestinal inflammation and possibly colorectal cancer.

The emulsifiers used in the study were the commonly used carboxymethylcellulose and polysorbate-80, but some others are soy lecithin, carrageenan, and polyglycerol ester. Processed foods often contain several emulsifiers, and while food regulations limit the amount of each emulsifier present in a particular food product to 1% to 2%, they don’t restrict the number of emulsifiers allowed. The study was done in mice, but the researchers tried to model the level of exposure of humans who eat a lot of processed food. From Science Daily:

Common food additive promotes colon cancer in mice

Emulsifiers, which are added to most processed foods to aid texture and extend shelf life, can alter intestinal bacteria in a manner that promotes intestinal inflammation and colorectal cancer, according to a new study. The findings, published in the journal Cancer Research, show regular consumption of dietary emulsifiers in mice exacerbated tumor development....There is increasing awareness that the intestinal microbiota, the vast, diverse population of microorganisms that inhabits the human intestines, play a role in driving colorectal cancer.

The microbiota is also a key factor in driving Crohn's disease and ulcerative colitis, the two most common forms of inflammatory bowel disease (IBD). IBD is known to promote colon tumorigenesis and gave rise to the term "colitis-associated cancer." Low-grade inflammation, a condition more prevalent than IBD, was shown to be associated with altered gut microbiota composition and metabolic disease and is observed in many cases of colorectal cancer. These recent findings suggest dietary emulsifiers might be partially responsible for this association.

Previous reports by the Georgia State research team suggested that low-grade inflammation in the intestine is promoted by consumption of dietary emulsifiers, which are detergent-like molecules incorporated into most processed foods that alter the composition of gut microbiota. The addition of emulsifiers to food seems to fit the time frame and had been shown to promote bacterial translocation across epithelial cells. Viennois and Chassaing hypothesized that emulsifiers might affect the gut microbiota in a way that promotes colorectal cancer. They designed experiments in mice to test this possibility.

In this study, the team fed mice with two very commonly used emulsifiers, polysorbate 80 and carboxymethylcellulose, at doses seeking to model the broad consumption of the numerous emulsifiers that are incorporated into the majority of processed foods. Researchers observed that consuming emulsifiers drastically changed the species composition of the gut microbiota in a manner that made it more pro-inflammatory, creating a niche favoring cancer induction and development. Alterations in bacterial species resulted in bacteria expressing more flagellin and lipopolysaccharide, which activate pro-inflammatory gene expression by the immune system.

Another study has found a link between low vitamin D levels and a health problem - this time, an increased risk of bladder cancer. Vitamin D is frequently called the "sunshine vitamin" because sunlight is the best source of vitamin D (our body makes vitamin D3 from sunlight exposure on bare skin). If you take vitamin D supplements, look for vitamin D3 (rather than D2). From Medical Xpress:

Low vitamin D levels linked to increased risk of bladder cancer

Vitamin D deficiency is associated with an increased risk of developing bladder cancer, according to a systematic review of seven studies presented today at the Society for Endocrinology annual conference in Brighton. Though further clinical studies are needed to confirm the findings, the study adds to a growing body of evidence on the importance of maintaining adequate vitamin D levels.

Vitamin D, which is produced by the body through exposure to sunshine, helps the body control calcium and phosphate levels. Vitamin D can also be obtained from food sources such as fatty fish and egg yolks. Previous studies have linked vitamin D deficiency with a host of health problems including cardiovascular disease, cognitive impairment, autoimmune conditions, and cancer.

In this work, researchers from the University of Warwick and University Hospital Coventry and Warwickshire investigated the link between vitamin D and bladder cancer risk. They reviewed seven studies on the topic, which ranged from 112 to 1,125 participants each. Five of the seven studies linked low vitamin D levels to an increased risk of bladder cancer.

In a separate experiment, the researchers then looked at the cells that line the bladder, known as transitional epithelial cells, and found that these cells are able to activate and respond to vitamin D, which in turn can stimulate an immune response. According to lead author of the study Dr Rosemary Bland, this is important because the immune system may have a role in cancer prevention by identifying abnormal cells before they develop into cancer. "....our work suggests that low levels of vitamin D in the blood may prevent the cells within the bladder from stimulating an adequate response to abnormal cells," said Dr Bland. 

Eating lots of fruits and vegetables (more than 10 servings a day!) is linked to better cognitive functioning in both normal-weight and overweight adults (both young and older adults), and may delay the onset of the cognitive decline that occurs with aging, as well as dementia. Overweight and obese older adults with a daily fruit and vegetable consumption of less than 5 servings generally had worse cognitive functioning. Higher levels of physical activity and higher daily fruit and vegetable consumption were both associated with better cognitive functioning. Cognitive functioning generally refers to a person's ability to reason and think, the mental processes needed to gather and process information, and all aspects of language and memory.

The York University researchers found that fruit and vegetable consumption, physical activity, and BMI or body mass index (normal, overweight, obese) all appear to interact in how a person mentally functions (cognitive functioning), especially as they age. The ideal goal as one ages is to preserve the mind. It appears that eating lots of fruits and vegetables daily (10 or more servings), being physically active (this includes daily walks), and being a healthy weight help with this goal. It helps to also be highly educated (or read books?) so that the brain has a "cognitive reserve".

Why is daily fruit and vegetable consumption (FVC) good for cognitive functioning and the brain? Studies find that daily consumption of fruits and vegetables is strongly associated with a reduced risk of cardiovascular disease, cancer, diabetes and age-related declines. They appear to be "protective" against cognitive decline. The study researchers point out that fruits and vegetables contain high quantities of vitamin C and E, fiber, micronutrients, flavonoids, beta-carotenes and other classes of phytochemicals. These are important in various ways: "they modulate detoxifying enzymes, stimulate the immune system, modulate cholesterol synthesis, and act as antibacterial, antioxidant or neuroprotective agents." NOTE: A serving of fruit is generally 1 medium fruit or 1/2 cup of fruit. A serving of vegetables is 1 cup of raw leafy greens or 1/2 cup of other vegetables. From Medical Xpress:

Healthy living linked to higher brain function, delay of dementia

It's tempting to dip into the leftover Halloween treats, but new research out of York University has found eating plenty of fruits and vegetables, combined with regular exercise, leads to better cognitive functioning for younger and older adults, and may delay the onset of dementia. York U post-doctoral fellow Alina Cohen and her team, including Professors Chris I. Ardern and Joseph Baker, looked at cross-sectional data of 45,522 participants, age 30 to 80+, from the 2012 annual component of the Canadian Community Health Survey.

What they found was that for those who are normal weight or overweight, but not obese, eating more than 10 servings of fruits and vegetables daily was linked to better cognitive functioning. When moderate exercise was added, even those eating fewer than five servings reported better cognitive functioning. Higher levels of physical activity appeared to mediate the relationship between higher daily fruit and vegetable consumption and better cognitive performance. Higher body mass indexes, lower activity levels, and lower fruit and vegetable consumption were associated with poorer cognitive functioning.

More details from the original study in the Journal of Public Health: Physical activity mediates the relationship between fruit and vegetable consumption and cognitive functioning: a cross-sectional analysis

Results: Higher BMIs, lower PA [physical activity] and FVC [fruit and vegetable consumption] were associated with poorer cognitive functioning. Additionally, PA statistically mediated the relationship between FVC and cognitive function (Sobel test: t = −3.15; P < 0.002); and higher education levels and daily FVC were associated with better cognitive function (P < 0.001). Conclusion: Higher PA levels were associated with better cognitive functioning in younger and older adults. Also, higher daily FVC and education levels were associated with better cognitive scores.

Individuals who were normal weight or overweight and reported a FVC of >10 servings per day reported better cognitive functioning scores than those who reported <10 servings, as well as those individuals with obesity. As well, both active and inactive individuals who reported a FVC of >10 servings per day had better cognitive scores than those who consumed fewer servings. However, in those who were moderately active, individuals with a daily FVC of <5 or 5–10 servings reported better cognitive functioning than those with a daily FVC of 10 or more servings; this may have resulted because of underestimations of the number of servings of fruits and vegetables actually consumed... Thus, increasing FVC and PA levels as well as having a healthy BMI may aid in the delay of cognitive decline.

Results also indicated that higher education levels along with a daily FVC of five or more servings were associated with better cognitive functioning. Education may be assisting in the process of delaying cognitive decline by increasing cognitive reserve, the ability of the human brain to cope with damage by using different brain processes to retain the ability to function well. Cognitive reserve is developed through intellectual stimulation and translates into a higher volume of connections between neurons and stronger rates of cerebral blood flow.


I've been thinking a lot about pesticides in foods and pesticides in honey since the recent post about untreated lawns and bees. Research finds that untreated lawns (no pesticides of any kind) with their diversity of flowering weeds or "spontaneous flowering plants" (such as clover and dandelions) are actually great pollen and nectar sources for bees. In other words, untreated lawns are great bee habitats! We think of conventional farms as using a lot of pesticides, but that is also true of suburbia with its obsession with "perfect lawns" and gardens, with no weeds allowed. According to the U.S. Fish and Wildlife Service: Up to ten times more pesticides per acre are used on suburban lawns than on conventional farms! Perhaps it's time to view clover and other "spontaneous flowering plants" as beneficial wildflowers providing food for bees, and not undesirable weeds that need eradicating with pesticides.

Thus these recent articles about pesticide residues in foods, including honey, caught my eye. Notice that conventional foods have pesticide residues, but not organic foods (or they may have much lower pesticide residues, typically due to contamination from neighboring farms). Honey is tricky - there may be pesticide residues because bees fly to pesticide contaminated areas, and also beekeepers may use pesticides for pest control when caring for their bees and bee hives. The most uncontaminated honey would be from organic beekeepers located in pristine areas, with no industry, farming, or treated lawns nearby, and who do not buy "wax starter comb". Bees typically forage for nectar and pollen within 2 miles of the hive, but may fly up to 7 miles from it. As an article in Scientific Reports pointed out: "Any agrochemical applied anywhere within a colony's extensive reach can end up back in the hive."

Note that no one knows what the long-term health effects are of ingesting foods daily with low levels of mixtures of pesticides (chronic exposure), and also of ingesting endocrine disruptors. A "cocktail effect" may occur from combined traces of different pesticides - the chemicals may be more toxic when combined than alone. We just don't know. The FDA just started testing for glyphosate residues (glyphosate is the active ingredient in Roundup) in 2016 - and this is the most heavily used pesticide in the world! Currently nearly 300 million pounds of glyphosate are applied each year on U.S. farms. Now add in all the other ways we're exposed to pesticides - both indoors and outdoors (lawns, gardens, farms), even on our treated pets. Something to think about.

A number of Kellogg and Nestle brand muesli cereals were among those tested in the following research. Muesli is a breakfast cereal of rolled oats, perhaps other grains, dried fruit, and nuts.

Lead exposure is a big problem for children throughout the United States and the rest of the world - whether from lead plumbing, lead paint, lead solder, or even nearby mining. There is no safe level of lead in children (best is zero) because it is a neurotoxicant - it can permanently lower IQ scores and cause other neurological harm. More lead gets absorbed if a person has an iron deficiency than if the person has normal iron levels.

This study found that simply eating iron-fortified biscuits daily lowered lead levels (and improved iron levels) in children over a period of several months. Two types of iron supplements were tested, and it was found that sodium iron EDTA (which is commonly added to foods) worked better than iron sulphate. The CDC (Centers for Disease Control and Prevention) recommends eating foods high in calcium, iron, and vitamin C to lower lead absorption. From Science Daily:

Iron supplements in the fight against lead

Lead is a toxic heavy metal that was added to gasoline for use in cars until as recently as 25 years ago. It is particularly harmful to the developing brains of infants, children and teenagers, and the damage it does is irreversible. The situation becomes significantly worse if people are exposed to a high level of lead at the same time as they are suffering from iron deficiency. In the small intestine, lead and iron bind to the same transport protein, which absorbs the metals into the bloodstream. If someone consumes too little iron with their food, the transporter increases its activity, and can carry lead into the bloodstream instead, leading to increased levels of the toxic heavy metal in the body and brain.

A team of researchers led by ETH professor Michael B. Zimmermann from the Laboratory of Human Nutrition have now shown in a study that fortifying food with iron produces a striking reduction in blood lead concentration in children exposed to high levels of the metal. This is the result of a trial involving over 450 children carried out by Zimmermann's former doctoral student Raschida Bouhouch and colleagues in southern Morocco.....Mining in the surrounding area meant that children of preschool and school age were exposed to an increased quantity of lead. At the same time, the level of iron in their blood was relatively low, placing them in a high-risk group.

Depending on their weight, the children were given several white-flour biscuits on a daily basis for a period of four and a half months. The biscuits were fortified with different iron preparations: some received biscuits containing a specific quantity of iron sulphate, while others received biscuits with sodium iron EDTA or sodium EDTA without iron. To test the effect of the iron supplements, some children received only placebo biscuits containing no additional iron. EDTA, which stands for ethylene diamine tetraacetic acid, forms stable complexes with iron, aiding its uptake into the bloodstream from the intestines, but it is not absorbed itself. EDTA can also bind to lead in the intestines, reducing its absorption....Sodium iron EDTA has already been used for iron fortification in foodstuffs for many years.

The researchers measured the children's blood lead concentration and iron status before and after the trial, as well as conducting tests to determine how well the children could solve cognitive tasks. The researchers were delighted to find that the biscuits fortified with iron did indeed reduce the level of lead in the blood -- specifically, by a third with sodium iron EDTA complexes and by a quarter with EDTA and iron sulphate.

Before the study began, the children's blood contained on average 4.3 micrograms of lead per decilitre. Biscuits with added sodium iron EDTA facilitated a reduction in blood lead concentration to 2.9 micrograms per decilitre. The biscuits also brought about an improvement in the children's iron status. On the other hand, the reduction in lead concentration had no effect on cognitive performance, as the researchers discovered during the corresponding tests. Although, contrary to the researchers' expectations, the children's blood lead concentration before supplementation with iron was in line with the worldwide average at 4.3 micrograms per decilitre of blood, it was still possible to achieve a considerable reduction by administering the biscuits.