
I've been asked whether vegan diets are safe during pregnancy, and I've always said that I don't know, but that avoiding all meat, fish, eggs, and dairy concerns me. Vegan diets exclude all animal-derived foods (meat, fish, dairy, eggs, and honey), while vegetarian diets avoid meat but do include eggs, honey, milk, and other dairy products. Thus it is very important that anyone following a vegan diet plans meals carefully to get all the necessary nutrients. For example, soda and french fries are vegan, but they are not good nutritionally.

There are a number of nutrients that probably need to be supplemented in vegan diets during pregnancy, especially vitamin B-12, iron, zinc, iodine (which can be obtained by using iodized salt), and omega-3 fatty acids, among others. So it was good to see this article by Dr. Drew Ramsey (Columbia University, NY) raising those same concerns about nutrient deficiencies during pregnancy, and pointing out that very few studies have looked at vegetarian and vegan diets in pregnancy - that the evidence is "scant". This topic is so controversial that as of today there were 85 comments by health professionals after the following Medscape article - both strongly pro and con, with studies cited showing that vegans tend to have lower levels of a number of measured nutrients (zinc, iron, omega-3s, etc.).

There are also reports (here and here) of babies who developed nutritional deficiencies (especially vitamin B-12, vitamin K, and vitamin D) after being exclusively breast-fed by vegan mothers who did not take vitamin supplements. This is because what a mother eats is what the baby gets during pregnancy and breast-feeding.

On the other hand, the Academy of Nutrition and Dietetics came out with a position statement in 2016 stating that "appropriately planned vegetarian, including vegan, diets are healthful, nutritionally adequate, and may provide health benefits for the prevention and treatment of certain diseases. These diets are appropriate for all stages of the life cycle, including pregnancy, lactation, infancy, childhood, adolescence, older adulthood, and for athletes". But that article (in the Journal of the Academy of Nutrition and Dietetics) then goes on in depth about several nutrients that can be lower or deficient in vegan diets unless supplements are given (B-12, iodine, etc.). From Medscape:

Are Vegan and Vegetarian Diets Safe During Pregnancy?

Are vegan and vegetarian diets safe during pregnancy?....Vegan and vegetarian diets are plant-based diets....Plants, in general, are a great choice, especially colorful plants, because they tend to be more nutrient-dense. Plant-based diets have been linked to a number of health benefits, such as lower body mass index and lower rates of obesity and diabetes. Certainly, vegan and vegetarian diets have higher amounts of certain nutrients like magnesium, folate, and fiber, all of which are generally consumed in very low quantities in Western diets.

Focusing on pregnancy, there are two important papers to note. The first, which got me very interested in this issue, was a series of pooled case reports by Drs Dror and Allen in 2008.[1] They looked at 30 cases of severe vitamin B12 deficiency during pregnancy in vegan women with pernicious anemia. Among the 30 vegan women who had B12 deficiency during pregnancy, about 60% of their offspring had severe developmental delays and 37% had cerebral atrophy.... The clinical importance of this is to partner with our patients who are eating plant-based diets and ensure that they have adequate levels of vitamin B12, as well as iron, zinc, and long-chain omega-3 fats during pregnancy.

The second article to take a peek at is from the British Journal of Obstetrics and Gynaecology in 2015.[2] This was a systematic review of the literature; after screening, the authors found about 13 papers focusing on maternal and infant outcomes, and about nine of those looked at nutrient deficiency. The main headline of this review is that there are no randomized clinical trials of vegan or vegetarian diets in pregnancy, so it is very hard to make a clear clinical recommendation. The epidemiologic data were heterogeneous - as the researchers called it, "scant." That is certainly true. Overall, there were no clear associations with bad outcomes. There was some increased risk for hypospadias in one of the larger studies that looked at vegan and vegetarian diets during pregnancy.

The main concern is that vegan and vegetarian diets put patients at risk for a number of nutrient deficiencies—vitamin B12 in vegan diets (and even for vegetarians), iron, zinc, and the long-chain omega-3 fats. Just a quick moment on the long-chain omega-3 fats: Dietarily, these only come from fatty fish and seafood. They are bioconcentrated. They start in algae, but they get bioconcentrated in our food supply via seafood. It is very hard to get those during pregnancy. Certainly, a lot of healthy babies have been born to vegetarian and vegan women. You can supplement, but given the benefits we see with omega-3 fats in patients with things like mood disorders or a history of psychotic disorders, I think it is important to consider the long-chain omega-3 fats. If someone is not going to get those in their diet but has a history, risk, or propensity toward mood disorders, think about supplementation. 

Excerpts from the review study (mentioned above) in BJOG: Vegan-vegetarian diets in pregnancy: danger or panacea? A systematic narrative review.

To review the literature on vegan-vegetarian diets and pregnancy outcomes [in healthy pregnant women]....None of the studies reported an increase in severe adverse outcomes or in major malformations, except one report of increased hypospadias in infants of vegetarian mothers. Five studies reported vegetarian mothers had lower birthweight babies, yet two studies reported higher birthweights. The duration of pregnancy was available in six studies and was similar between vegan-vegetarians and omnivores. The nine heterogeneous studies on microelements and vitamins suggest vegan-vegetarian women may be at risk of vitamin B12 and iron deficiencies.

CONCLUSIONS: The evidence on vegan-vegetarian diets in pregnancy is heterogeneous and scant. The lack of randomised studies prevents us from distinguishing the effects of diet from confounding factors. Within these limits, vegan-vegetarian diets may be considered safe in pregnancy, provided that attention is paid to vitamin and trace element requirements.

New research further confirms a link between higher lutein levels (as measured in the blood) and the preservation of "crystallized intelligence" in older adults. Crystallized intelligence is the ability to use the skills and knowledge one has acquired over a lifetime. Lutein is in foods such as leafy green vegetables, cruciferous vegetables (broccoli, brussels sprouts, cauliflower, and cabbage), and egg yolks. Lutein is also found in small amounts in other fruits and vegetables. Bottom line: eat a variety of fresh fruits and vegetables daily. [Original study.] From Medical Xpress:

Study links nutrition to brain health and intelligence in older adults

A study of older adults links consumption of a pigment found in leafy greens to the preservation of "crystallized intelligence," the ability to use the skills and knowledge one has acquired over a lifetime.

Lutein (LOO-teen) is one of several plant pigments that humans acquire through the diet, primarily by eating leafy green vegetables, cruciferous vegetables such as broccoli, or egg yolks, said University of Illinois graduate student Marta Zamroziewicz, who led the study with Illinois psychology professor Aron Barbey. Lutein accumulates in the brain, embedding in cell membranes, where it likely plays "a neuroprotective role," she said. "Previous studies have found that a person's lutein status is linked to cognitive performance across the lifespan," Zamroziewicz said. 

The study enrolled 122 healthy participants aged 65 to 75 who solved problems and answered questions on a standard test of crystallized intelligence. Researchers also collected blood samples to determine blood serum levels of lutein and imaged participants' brains using MRI to measure the volume of different brain structures. The team focused on parts of the temporal cortex, a brain region that other studies suggest plays a role in the preservation of crystallized intelligence.

The researchers found that participants with higher blood serum levels of lutein tended to do better on tests of crystallized intelligence. Serum lutein levels reflect only recent dietary intakes, Zamroziewicz said, but are associated with brain concentrations of lutein in older adults, which reflect long-term dietary intake. Those with higher serum lutein levels also tended to have thicker gray matter in the parahippocampal cortex, a brain region that, like crystallized intelligence, is preserved in healthy aging, the researchers report..... "Our findings do not demonstrate causality," Zamroziewicz said. "We did find that lutein is linked to crystallized intelligence through the parahippocampal cortex."

Magnesium is a mineral found in the human body that is necessary for good health. New research analysed 40 studies and found that a diet rich in magnesium is associated with a reduced risk of stroke, heart failure, diabetes, and death ("all cause mortality").

Even though there are many magnesium rich foods, it is estimated that many people don't get enough magnesium in the diet, especially if they eat a lot of processed, low-fiber foods. Current Recommended Dietary Allowances (RDAs) are 320 mg daily for adult females and 420 mg daily for adult males (NIH magnesium fact-sheets - here and here). Especially good sources of magnesium are green leafy vegetables, legumes (beans), nuts, seeds, chocolate, and whole grains. In general, foods containing dietary fiber provide magnesium. From EurekAlert:

Dietary magnesium associated with reduced risk of heart disease, stroke and diabetes

A diet rich in magnesium may reduce the risk of diseases including coronary heart disease, stroke and type-2 diabetes according to a new meta-analysis published in the open access journal BMC Medicine. This analysis of the evidence on dietary magnesium and health outcomes is the largest to date, involving data from more than one million people across nine countries.

The researchers, from Zhejiang University and Zhengzhou University in China, found that people in the highest category of dietary magnesium consumption had a 10% lower risk of coronary heart disease, 12% lower risk of stroke and a 26% lower risk of type-2 diabetes compared to those in the lowest category. Their results also indicate that an extra 100 mg per day of dietary magnesium could also reduce risk of stroke by 7% and type-2 diabetes by 19%.

Magnesium is vital for human health and normal biological functions including glucose metabolism, protein production and synthesis of nucleic acids such as DNA. Diet is the main source of magnesium as the element can be found in foods such as spices, nuts, beans, cocoa, whole grains and green leafy vegetables.

Original study, from BMC Medicine: Dietary magnesium intake and the risk of cardiovascular disease, type 2 diabetes, and all-cause mortality: a dose–response meta-analysis of prospective cohort studies

Increasing dietary magnesium intake is associated with a reduced risk of stroke, heart failure, diabetes, and all-cause mortality, but not CHD [coronary heart disease] or total CVD [cardiovascular disease]. These findings support the notion that increasing dietary magnesium might provide health benefits....Magnesium is essential to all living organisms, as it controls the function of many crucial enzymes, including those that utilize or synthesize ATP ....

A large review of nut studies found that people eating a daily handful of nuts (about 20 g) have a lower risk of heart disease, cancer, stroke, premature death, and death from respiratory disease, type 2 diabetes, and infectious disease. Truly impressive. Benefits seem to be for all nuts, and also peanuts - which are called nuts, but are actually legumes (other posts about nut consumption benefits). An earlier post discussed how some of these effects could be due to nuts lowering systemic inflammation throughout the body. Bottom line: try to eat a handful of nuts every day, or most days a week, for your health. And make it a variety of nuts - walnuts, almonds, hazelnuts, cashews, pistachios, pecans, Brazil nuts, and peanuts. From Science Daily:

A handful of nuts a day cuts the risk of a wide range of diseases

A large analysis of current research shows that people who eat at least 20g of nuts a day have a lower risk of heart disease, cancer and other diseases. The analysis of all current studies on nut consumption and disease risk has revealed that 20g a day -- equivalent to a handful -- can cut people's risk of coronary heart disease by nearly 30 percent, their risk of cancer by 15 percent, and their risk of premature death by 22 percent. An average of at least 20g of nut consumption was also associated with a reduced risk of dying from respiratory disease by about a half, and diabetes by nearly 40 percent, although the researchers note that there is less data about these diseases in relation to nut consumption.

The study, led by researchers from Imperial College London and the Norwegian University of Science and Technology, is published in the journal BMC Medicine. The research team analysed 29 published studies from around the world that involved up to 819,000 participants, including more than 12,000 cases of coronary heart disease, 9,000 cases of stroke, 18,000 cases of cardiovascular disease and cancer, and more than 85,000 deaths. While there was some variation between the populations that were studied....the researchers found that nut consumption was associated with a reduction in disease risk across most of them.

The study included all kinds of tree nuts, such as hazel nuts and walnuts, and also peanuts -- which are actually legumes. The results were in general similar whether total nut intake, tree nuts or peanuts were analysed. What makes nuts so potentially beneficial, said Aune, is their nutritional value: "Nuts and peanuts are high in fibre, magnesium, and polyunsaturated fats -- nutrients that are beneficial for cutting cardiovascular disease risk and which can reduce cholesterol levels. "Some nuts, particularly walnuts and pecan nuts are also high in antioxidants, which can fight oxidative stress and possibly reduce cancer risk. Even though nuts are quite high in fat, they are also high in fibre and protein, and there is some evidence that suggests nuts might actually reduce your risk of obesity over time."

The study also found that if people consumed on average more than 20g of nuts per day, there was little evidence of further improvement in health outcomes. [ORIGINAL STUDY]

It turns out that scurvy and vitamin C deficiency are still around these days. Scurvy is a disease resulting from a lack of vitamin C. Most animals can synthesize vitamin C, but humans cannot. We must eat foods containing vitamin C to get the vitamin.

Vitamin C deficiency results in defective formation of collagen and connective tissues (in our bones, skin, tendons, and muscles), and symptoms may include weakness, fatigue, corkscrew hairs, sore arms and legs, easy bruising, bleeding gums, and impaired wound healing.

A recent small Australian study looked at diabetic persons with chronic foot wounds (foot ulcers that hadn't healed after several months). Their vitamin C levels were tested, and those found to be low were given vitamin C supplements of 500 or 1000 mg daily; within 2 to 3 weeks the wounds healed. The one person with a zinc deficiency was given a 50 mg daily zinc supplement, and that wound also promptly healed.

Treatment of scurvy is by taking vitamin C supplements (the Mayo Clinic recommends taking 400 to 1000 milligrams of vitamin C daily for one week). Vitamin C deficiency can be easily prevented by a diet that includes fruits and vegetables. The recommended daily intake for adult women is 75 milligrams and for adult men it is 90 milligrams, which can be easily met by eating fruits and vegetables, especially if they are fresh (uncooked).

Good sources of vitamin C include: oranges, lemons, kiwi fruit, black currants, papaya, guava, pineapple, mango, strawberries, and vegetables such as bell peppers (red, yellow, green), tomatoes, potatoes, kale, brussels sprouts, and broccoli. It is possible to be vitamin C deficient even if the person is of normal weight or overweight - it all comes down to the diet and whether fruits and vegetables are eaten. Bottom line: Eat some daily!

From Medical Xpress: Poor diet sees scurvy reappear in Australia

Scurvy, a disease historically associated with old-world sailors on long voyages, is making a surprise comeback in Australia, with health officials Tuesday revealing a rare spate of cases. Caused by vitamin C deficiency, the condition used to be a common—and often fatal—curse among seafarers who went months without fresh fruit and vegetables.


As you may have noticed, I write about the beneficial bacteria Lactobacillus sakei a lot. This is because it has turned out to be a great treatment for both chronic and acute sinusitis for my family and others (see post The One Probiotic That Treats Sinusitis). We originally found it in kimchi (it occurs in the kimchi during normal fermentation), but not all kimchi brands. Kimchi is a mix of vegetables (including typically cabbage) and seasonings, which is then fermented for days or weeks before it is eaten.

Why is L. sakei found in some kimchi, but not all? Which vegetable or spice is needed or important for encouraging L. sakei growth? It turns out it is not the cabbage - which is why L. sakei is not normally found in sauerkraut.

A recent study looking at several kimchi samples found that garlic seems to be important for the development of various Lactobacillus bacteria, of which L. sakei is one. The results suggest that raw garlic carries low levels of lactic acid bacteria such as L. sakei, which then multiply during kimchi fermentation. Note that as fermentation progresses, the bacterial species composition in the kimchi changes (this is called ecological succession).

Korean studies (here and here) have consistently found L. sakei in many brands of kimchi (especially from about day 14 to about 2 or 2 1/2 months of fermentation), but not all kimchi brands or recipes. L. sakei, of which there are many strains, is so beneficial because it "outcompetes other spoilage- or disease-causing microorganisms" and so prevents them from growing (see post).

Excerpts are from the blog site Microbial Menagerie: MICROBES AT WORK IN YOUR KIMCHI

Cabbage is chopped up into large pieces and soaked in salt water allowing the water to draw out from the cabbage. Other seasonings such as spices, herbs and aromatics are prepared. Ginger, onion, garlic, and chili pepper are commonly used. The seasonings and cabbage are mixed together. Now the kimchi is ready to ferment. The mixture is packed down in a glass container and covered with the brining liquid if needed. The kimchi sits at room temperature for 1-2 days for fermentation to take place....Kimchi does not use a starter culture, but is still able to ferment. Then where do the fermentation microbes come from?

Phylogenetic analysis based on 16S rRNA sequencing indicates that the kimchi microbiome is dominated by lactic acid bacteria (LAB) of the genera Leuconostoc, Lactobacillus, and Weissella. Kimchi relies on the native microbes of the ingredients. That is, the microbes naturally found on the ingredients. Because of this, there may be wide variations in the taste and texture of the final kimchi product depending on the source of the ingredients. In fact, a research group from Chung-Ang University acquired the same ingredients from different markets and sampled the bacterial communities within each of the ingredients. The group found a wide variability in the same ingredient when it was bought from different markets. Surprisingly, the cabbage was not the primary source of LAB. Instead, lactic acid bacteria were found in high abundance in the garlic samples.

Note that Lactobacillus sakei is an example of a lactic acid bacteria. More study details from the Journal of Food Science: Source Tracking and Succession of Kimchi Lactic Acid Bacteria during Fermentation.

This study aimed at evaluating raw materials as potential lactic acid bacteria (LAB) sources for kimchi fermentation and investigating LAB successions during fermentation. The bacterial abundances and communities of five different sets of raw materials were investigated using plate-counting and pyrosequencing. LAB were found to be highly abundant in all garlic samples, suggesting that garlic may be a major LAB source for kimchi fermentation. LAB were observed in three and two out of five ginger and leek samples, respectively, indicating that they can also be potential important LAB sources. LAB were identified in only one cabbage sample with low abundance, suggesting that cabbage may not be an important LAB source.

Bacterial successions during fermentation in the five kimchi samples were investigated by community analysis using pyrosequencing. LAB communities in initial kimchi were similar to the combined LAB communities of individual raw materials, suggesting that kimchi LAB were derived from their raw materials. LAB community analyses showed that species in the genera Leuconostoc, Lactobacillus, and Weissella were key players in kimchi fermentation, but their successions during fermentation varied with the species, indicating that members of the key genera may have different acid tolerance or growth competitiveness depending on their respective species.

Although W. koreensis, Leu. mesenteroides, and Lb. sakei were not detected in the raw materials of kimchi samples D and E (indicating their very low abundances in raw materials), they were found to be predominant during the late fermentation period. Several previous studies have also reported that W. koreensis, Leu. mesenteroides, and L. sakei are the predominant kimchi LAB during fermentation (Jeong and others 2013a, 2013b; Jung and others 2011, 2012, 2013a, 2014). 

An interesting study that showed that when gut microbes are deprived of dietary fiber (their food) they start to eat the natural layer of mucus that lines the colon. (The colon is part of the large intestine). This is important because the colon's mucus layer normally acts as a barrier to pathogenic microbes. Yes, it was done in mice, but the researchers feel that this study accurately models what also happens in humans. Their conclusion: when the microbes in the gut don't get enough dietary fiber from plants (such as whole grains, fruits, vegetables, seeds, nuts), then the microbes feed on the colon's mucus layer, which results in inflammation and makes the colon more vulnerable to pathogenic (disease causing) microbes. This is what some people refer to as "leaky gut".

Research shows that changes in the diet (high fiber vs. low fiber) quickly result in changes in the gut microbes of humans and rodents - so it's important to consistently eat a lot of a variety of plant fiber. Currently the recommended daily fiber intake for adults is 28 to 35 grams (chart of some high fiber foods). The researchers found that some bacterial strains flourished best in low- or no-fiber conditions, and it was these bacteria that were involved in breaking down the mucus layer. The research also showed that so-called "prebiotics" (purified forms of soluble fiber similar to what some processed foods and supplements contain) also resulted in thinning of the colon's mucus layer - they did not properly feed the gut microbes. From Medical Xpress:

High-fiber diet keeps gut microbes from eating colon's lining, protects against infection

It sounds like the plot of a 1950s science fiction movie: normal, helpful bacteria that begin to eat their host from within, because they don't get what they want. But new research shows that's exactly what happens when microbes inside the digestive system don't get the natural fiber that they rely on for food. Starved, they begin to munch on the natural layer of mucus that lines the gut, eroding it to the point where dangerous invading bacteria can infect the colon wall. In a new paper in Cell, an international team of researchers show the impact of fiber deprivation on the guts of specially raised mice. The mice were born and raised with no gut microbes of their own, then received a transplant of 14 bacteria that normally grow in the human gut. 

The findings have implications for understanding not only the role of fiber in a normal diet, but also the potential of using fiber to counter the effects of digestive tract disorders. "The lesson we're learning from studying the interaction of fiber, gut microbes and the intestinal barrier system is that if you don't feed them, they can eat you," says Eric Martens, Ph.D., an associate professor of microbiology at the University of Michigan Medical School....Using U-M's special gnotobiotic, or germ-free, mouse facility, and advanced genetic techniques that allowed them to determine which bacteria were present and active under different conditions, they studied the impact of diets with different fiber content - and those with no fiber. They also infected some of the mice with a bacterial strain that does to mice what certain strains of Escherichia coli can do to humans - cause gut infections that lead to irritation, inflammation, diarrhea and more.

The result: the mucus layer stayed thick, and the infection didn't take full hold, in mice that received a diet that was about 15 percent fiber from minimally processed grains and plants. But when the researchers substituted a diet with no fiber in it, even for a few days, some of the microbes in their guts began to munch on the mucus. They also tried a diet that was rich in prebiotic fiber - purified forms of soluble fiber similar to what some processed foods and supplements currently contain. This diet resulted in the same erosion of the mucus layer as observed in the lack of fiber.

The researchers also saw that the mix of bacteria changed depending on what the mice were being fed, even day by day. Some species of bacteria in the transplanted microbiome were more common - meaning they had reproduced more - in low-fiber conditions, others in high-fiber conditions. And the four bacteria strains that flourished most in low-fiber and no-fiber conditions were the only ones that make enzymes that are capable of breaking down the long molecules called glycoproteins that make up the mucus layer....  Just like the mix of bacteria, the mix of enzymes changed depending on what the mice were being fed, with even occasional fiber deprivation leading to more production of mucus-degrading enzymes.

Images of the mucus layer, and the "goblet" cells of the colon wall that produce the mucus constantly, showed the layer was thinner the less fiber the mice received. While mucus is constantly being produced and degraded in a normal gut, the change in bacteria activity under the lowest-fiber conditions meant that the pace of eating was faster than the pace of production - almost like an overzealous harvesting of trees outpacing the planting of new ones. 

When the researchers infected the mice with Citrobacter rodentium - the E. coli-like bacteria - they observed that these dangerous bacteria flourished more in the guts of mice fed a fiber-free diet. Many of those mice began to show signs of illness and lost weight. When the scientists looked at samples of their gut tissue, they saw not only a much thinner or even patchy mucus layer - they also saw inflammation across a wide area. Mice that had received a fiber-rich diet before being infected also had some inflammation, but across a much smaller area. [Original study]

A thick mucus layer (green), generated by the cells of the colon's wall, provides protection against invading bacteria and other pathogens. This image of a mouse's colon shows the mucus (green) acting as a barrier for the "goblet" cells (blue) that produce it. Credit: University of Michigan

This study found impressive results - improvement in autistic behaviors in children diagnosed with autism spectrum disorder (ASD) after four months of daily vitamin D supplementation. Children in the placebo group did not show improvement. A nice aspect of the study was that the children were randomly assigned to a placebo or a vitamin D group (so that the groups were not self-selected) and it was double-blinded (so no one knew who was getting the vitamins - again to prevent bias). This was a preliminary study - meaning more studies are needed, but it would be amazing if these results hold up... From Science Daily:

Vitamin D supplements may benefit children with autism spectrum disorder

Studies have shown an association between the risk of autism spectrum disorder and vitamin D insufficiency. In this latest study, 109 children with autism spectrum disorder were randomized to receive four months of vitamin D3 supplementation or a placebo. "Autism symptoms -- such as hyperactivity, social withdrawal, and others -- improved significantly following vitamin D3 supplementation but not after receiving placebo," said Dr. Khaled Saad, lead author of the Journal of Child Psychology and Psychiatry study.

Excerpts from the original study from The Journal of Child Psychology and Psychiatry: Randomized controlled trial of vitamin D supplementation in children with autism spectrum disorder

Autism spectrum disorder (ASD) is a frequent developmental disorder characterized by pervasive deficits in social interaction, impairment in verbal and nonverbal communication, and stereotyped patterns of interests and activities. It has been previously reported that there is vitamin D deficiency in autistic children; however, there is a lack of randomized controlled trials of vitamin D supplementation in ASD children.

This study is a double-blinded, randomized clinical trial (RCT) that was conducted on 109 children with ASD (85 boys and 24 girls; aged 3–10 years). The aim of this study was to assess the effects of vitamin D supplementation on the core symptoms of autism in children. ASD patients were randomized to receive vitamin D3 or placebo for 4 months. The serum levels of 25-hydroxycholecalciferol (25 (OH)D) were measured at the beginning and at the end of the study. The autism severity and social maturity of the children were assessed by the Childhood Autism Rating Scale (CARS), .... 
Supplementation of vitamin D was well tolerated by the ASD children. The daily dose used in the therapy group was 300 IU vitamin D3/kg/day, not to exceed 5,000 IU/day. The autism symptoms of the children improved significantly following 4-month vitamin D3 supplementation, but not in the placebo group. This study demonstrates the efficacy and tolerability of high doses of vitamin D3 in children with ASD.

Recently, Wang et al. (2016) performed a systematic review and meta-analysis of all studies on serum concentration of 25 (OH)D in ASD (Wang et al., 2016). Eleven studies were included, accounting for a total of 870 ASD patients and 782 healthy controls. Serum levels of 25 (OH)D in participants with ASD were significantly lower than those in controls. They concluded that low vitamin D might serve as a risk factor for autism spectrum disorder (Wang et al., 2016). 

In a recent survey, our research group measured 25 (OH)D in 122 ASD children (3–9 years old) and 100 healthy children as controls (Saad, Abdel-Rahman, et al., 2015). The ASD group showed a significantly lower level of serum 25 (OH)D compared with the control group (p < .0001). The study found highly significant inverse correlations between serum 25 (OH)D levels and autism rating scales. In the second part of the previous study (Saad, Abdel-Rahman, et al., 2015), an open-label trial of 83 subjects who completed a 3-month therapy with high daily doses of vitamin D (300 IU/kg/day) was performed. Collectively, 80.7% of the children with ASD had significantly improved outcome, which was mainly in the sections of the CARS and ABC subscales that measure behavior, stereotypy, eye contact, and attention span (Saad, Abdel-Rahman, et al., 2015).

Guidelines for how to prevent food allergies in children are changing. Until very recently, it was avoid, avoid, avoid exposing babies or young children to any potential allergens. Remember parents being advised that if an allergy to X (whether pets or food) runs in the family, then absolutely avoid exposing the child to the potential allergen? Well, recent research (here, here, and here) found that the opposite is true - that in the first year of life the baby should be exposed to potential allergens (whether animals or food), which stimulates the child's developing immune system in beneficial ways.

Physicians at a recent conference of allergists said that evidence shows that allergenic foods — including peanuts, eggs, and milk — should be introduced in the first year of life. The new 2017 medical guidelines will recommend introducing small amounts of peanuts (mixed in with other foods) when children are 4 to 6 months of age.

About two years ago a landmark study (the LEAP study) found that when infants at a high risk of developing peanut allergy consumed peanuts on a regular basis, their risk of peanut allergy was dramatically reduced. And the opposite was also true: peanut avoidance in the first year of life was associated with a greater frequency of peanut allergy. This has made doctors rethink their strategies for preventing food allergies. From Medscape:

Allergenic Foods Should Be Introduced to Infants Early

Although the evidence shows that allergenic foods — including peanuts, eggs, and milk — should be introduced in the first year of life, guidelines are lagging behind, said an allergist speaking here at the American College of Allergy, Asthma & Immunology (ACAAI) 2016 Annual Scientific Meeting. Official guidelines to be issued early in 2017 will address only peanuts, recommending introduction when children are 4 to 6 months of age.

"There is now a large body of observation and trial data for other foods, including egg, that show that delaying the introduction of allergenic solids increases the risk of those particular food allergies," said Katrina Allen, MBBS, PhD, from the Murdoch Childrens Research Institute in Melbourne, Australia. Policy changes are needed to help guide parents' decisions, she said. In fact, there is evidence showing that changes to policy — namely, infant-feeding guidelines — mirror the rise in the incidence of food allergies.

Not everyone agrees on exposure amount and timing in the case of egg allergy. In a recent trial, researchers looked at the early introduction of allergenic foods in breast-fed children (N Engl J Med. 2016;374:1733-1743). The prevalence of any food allergy was significantly lower in the early-introduction group than in the standard-introduction group, as was the prevalence of peanut allergy and egg allergy. And a study Dr Allen was involved in, which introduced cooked egg in small amounts, showed that early introduction reduced allergy (J Allergy Clin Immunol. 2010;126:807-813).

However, in a German study, where greater amounts of egg were introduced at 4 to 6 months, early exposure increased the risk for life-threatening allergic reactions (J Allergy Clin Immunol. Published online August 12, 2016). And in the STEP study, there was no change in the number of food allergies in 1-year-old children when egg was introduced early (J Allergy Clin Immunol. Published online August 20, 2016). However, that did not take into account high-risk infants, particularly those with eczema, who are known to have a higher incidence of egg allergy and are likely to see a much greater benefit from the early introduction of egg.

The new peanut guidelines — coauthored by Amal Assa'ad, MD, from the Cincinnati Children's Hospital, who is chair of the ACAAI food allergy committee — will recommend that children with no eczema or egg allergy can be introduced to peanut-containing foods at home, according to the family's preference. And for children with mild to moderate eczema who have already started solid foods, the guidelines say that peanut-containing foods can be introduced at home at around 6 months of age, without the need for an evaluation. However, the guidelines caution, peanut-containing foods should not be the first solid food an infant tries, and an introduction should be made only when the child is healthy. The first feeding should not happen when the child has a cold, is vomiting, or has diarrhea or another illness.

For eggs, there is no official recommendation as of yet....The early introduction of allergenic foods is not the only policy that needs to be changed to lower the incidence of food allergies, Dr Allen told Medscape Medical News. Other factors, particularly environmental factors — mostly written up in observational studies — are contributing to an increasing intolerance to allergenic foods. Policies advocating that kids "get down and dirty," have more exposure to dogs, and bathe less are also warranted....Dr Allen and Dr Assa'ad agree that delaying the introduction of foods such as cow's milk and egg until after 12 months is harmful. Guidelines should encourage families to introduce these foods in the first year of life, once solids have commenced at around 6 months, but not before 4 months.

An article points out what we should all be concerned with, but which is being ignored - pesticide residues of glyphosate (found in Monsanto's Roundup) and 2,4-D in foods. Glyphosate is the most heavily used pesticide in the world, and residues of both glyphosate and 2,4-D in food are set to increase sharply with the introduction of genetically modified crops resistant to both pesticides. (Because the crops are resistant, farmers can spray the pesticides against weeds repeatedly without killing the crops - so the crops, and the foods made from them, will contain pesticide residues.) Both pesticides are linked to a variety of health problems.

The FDA, under pressure, started analyzing some foods this year for glyphosate residues, but has already stopped due to "problems". When and if testing will resume is unknown. It is important to know that pesticide residues at varying levels have been found in many foods (here, here, and here). What this chronic low-level exposure does to humans is unknown. The only way to avoid or minimize pesticide exposure is to eat organic foods (here, here, and here).

When reading the following excerpts, note that there are no standards for glyphosate residues in honey in the United States. Research by an FDA chemist and a chemist at the University of Iowa found glyphosate residues at varying levels, with a high of 653 parts per billion (ppb) in one sample (honey from Iowa) - more than 10 times the 50 ppb limit allowed in the European Union. Other honey samples tested had glyphosate residues ranging from under 16 ppb to over 123 ppb (in honey from Louisiana). I mention this because pesticide residues are an important issue: we don't know what chronic exposure to mixtures of low levels of pesticides in foods does to us. To babies and children, to pregnant women, to the elderly, to all of us. The following article excerpts are by journalist Carey Gillam, from The Huffington Post:
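To put those honey numbers in perspective, here is a quick back-of-the-envelope check of the "more than 10 times" claim, using the residue levels and the 50 ppb EU limit exactly as reported above:

```python
EU_LIMIT_PPB = 50  # European Union limit for glyphosate in honey, per the report

# Residue levels cited above (parts per billion), by sample origin
samples_ppb = {"Iowa": 653, "Louisiana": 123}

for origin, level in samples_ppb.items():
    ratio = level / EU_LIMIT_PPB
    print(f"{origin}: {level} ppb is {ratio:.1f}x the EU limit")
```

The Iowa sample works out to roughly 13 times the EU limit, consistent with the "more than 10 times" figure; even the Louisiana sample is more than double the EU limit, despite there being no U.S. tolerance level at all.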

FDA Suspends Testing for Glyphosate Residues in Food

Government testing for residues of an herbicide that has been linked to cancer has been put on hold, slowing the Food and Drug Administration’s first-ever endeavor to get a handle on just how much of the controversial chemical is making its way into U.S. foods. The FDA, the nation’s chief food safety regulator, launched what it calls a “special assignment” earlier this year to analyze certain foods for residues of the weed killer called glyphosate after the agency was criticized by the U.S. Government Accountability Office for failing to include glyphosate in annual testing programs that look for many less-used pesticides. Glyphosate is the most widely used herbicide in the world, and is the key ingredient in Monsanto Co.’s branded Roundup herbicide line.

Glyphosate is under particular scrutiny now after the World Health Organization’s cancer experts last year declared the chemical a probable human carcinogen. Several private groups and nonprofits have been doing their own testing, and have been finding glyphosate residues in varying levels in a range of foods, raising consumer concerns about the pesticide’s presence in the American diet.

The FDA’s residue testing for glyphosate was combined with a broader herbicides analysis program the FDA set in motion in February of this year. But the glyphosate testing has been particularly challenging for the FDA. The agency was finally forced to put the glyphosate residue testing part of the work plan on hold amid confusion, disagreement and difficulties ....FDA spokeswoman Megan McSeveney confirmed the testing suspension and said the agency is not sure when it will resume.....Alongside the testing for glyphosate, the FDA laboratories have also been analyzing foods for 2,4-D and other “acid herbicides,” documents obtained from the FDA show. The category of acid herbicides includes five of the top 10 active ingredients used in homes and gardens. Usage of 2,4-D is expected to triple in the coming year, according to the FDA.

McSeveney said glyphosate residues were only being analyzed in soy, corn, milk and eggs and the popcorn samples, while the other foods are being tested for residues of other herbicides. Earlier this year, one of the agency’s senior chemists also analyzed glyphosate residues in honey and oatmeal and reported his results to the agency. Some honey samples contained residue levels well over the limit allowed in the European Union. The United States has no legal tolerance for glyphosate in honey, though the Environmental Protection Agency (EPA) said recently it may set one because of the FDA findings. 

With the testing on hold, it is not clear when the agency might have final results on the glyphosate residue analysis. McSeveney said preliminary results showed no violations of legal tolerance levels allowed for glyphosate in the foods tested. She did not provide details on what, if any, levels of residue were found. Tolerance levels are set by the EPA for a variety of pesticides expected to be found in foods. When residue levels are detected above the tolerance levels, enforcement action can be taken against the food producers.

Though FDA annually tests domestic and imported foods for residues of other pesticides, it never tested for glyphosate before. It has not routinely tested for 2,4-D either, a fact also criticized by the GAO. The FDA testing for 2,4-D residues comes as the use of 2,4-D with food crops is expected to start rising due to the commercialization of new formulated herbicide products that combine glyphosate and 2,4-D. Safety questions have been raised about the combination. But the EPA gave a green light on Nov. 1 to a Dow AgroSciences’ herbicide combination of glyphosate and 2,4-D. The new products are intended to counter widespread weed resistance to glyphosate, and be used with new types of genetically engineered herbicide-tolerant crops.