
Another article reporting on the Crohn's disease study I posted yesterday. But this article lists the depleted bacteria and also which ones there are too much of in Crohn's patients. It illustrates that gut microbial communities being out of whack go hand in hand with disease. Remember: dysbiosis means an imbalance in the microbial populations. Interestingly, just like in the 2012 sinusitis study (see my December 4, 2013 post), it was biopsies that revealed the specific bacterial imbalances, not fecal samples or mucus/phlegm swabs (which are typically done in sinusitis). Big step forward in human microbiome research. And again antibiotics are not the answer.

From Science magazine:

Crohn's Disease Marked by Dramatic Changes in Gut Bacteria

The largest clinical study of its kind is revealing new insights into the causes of Crohn's disease, a periodic inflammation of the intestines. The research, which involved 668 children, shows that numbers of some beneficial bacteria in the gut decrease in Crohn's patients, while the number of potentially harmful bacteria increases. The study could lead to new, less invasive diagnostic tests; it also shows that antibiotics—which aren't recommended for Crohn's but are often given when patients first present with symptoms—may actually make the disease worse.

Crohn’s disease is one of the two major inflammatory bowel diseases (IBDs); the other is ulcerative colitis, a similar condition that affects only the colon. Both have been on the rise in the developed world since the early 1950s; now, an estimated 1.4 million people suffer from IBD in the United States alone. Symptoms include diarrhea, abdominal pain and cramping, and intestinal ulcers.

But genes alone can't explain the sharp rise in IBD incidence, and scientists have looked at the environment—in particular diet and antibiotic use—for answers.

Several studies have shown that Crohn’s disease is characterized by microbial dysbiosis, a shift in the microbial populations inhabiting the gut, but it's difficult to unravel cause and effect: A change in gut microbiota can cause inflammation, but the reverse can also occur. Complicating the picture, patients often receive antibiotics before being diagnosed with IBD, to fend off a supposed gut infection that could be causing their symptoms; those antibiotics themselves have a powerful impact on the microbial populations living in our guts.

Now, a group headed by Ramnik Xavier, a gastroenterologist at Harvard Medical School in Boston, has collected fecal samples and taken biopsies of the lower part of the small intestine and rectum from 447 children who had just been diagnosed with Crohn's, and a control group of 221 kids who had noninflammatory abdominal symptoms, such as bloating and diarrhea. In contrast with previous studies, the majority of patients had not yet received antibiotics or anti-inflammatory drugs. Based on their genetic material, the researchers determined the relative abundance of a range of microbial species in the samples.

Some potentially harmful microbial species were more abundant in Crohn's patients, such as those belonging to the Enterobacteriaceae, Pasteurellaceae, Veillonellaceae, and Fusobacteriaceae; numbers of the Erysipelotrichales, Bacteroidales, and Clostridiales, generally considered to be beneficial, were lower. The disappearance and appearance of species can be equally important, says Dirk Gevers of the Broad Institute in Cambridge, Massachusetts, who performed most of the work. "There has been a shift in the ecosystem, which affects both types."

But those differences were found mostly in the biopsy samples; there weren't many differences between the feces from Crohn's patients and the control group. At this early stage of the disease, "the dysbiosis seems not to have reached the stool yet," Gevers says.

The dysbiosis was also more pronounced in patients who had received antibiotics. "This study confirms that these drugs don’t do any good to people with Crohn’s disease," says gastroenterologist Séverine Vermeire of the Catholic University of Leuven in Belgium, who was not involved in the study. "We knew antibiotic use increases the risk to develop the disease; now we know they can worsen it, too."

Vermeire says it's a "missed opportunity" that the researchers didn't look at the patients' diets. "That could have helped elucidate why this disease occurs so much more in the Western world than elsewhere." In 2011, Vermeire’s group published a study showing that healthy family members of Crohn's disease patients have a slight dysbiosis as well. Vermeire is convinced that even in these families, it's not genetics but some lifestyle factor that causes the phenomenon. "If we could identify the dysbiosis in an early stage, and we knew the causative factors,” she says, “we could prevent disease occurrence by bringing about lifestyle changes.”

Two related studies showing the importance of the intestinal bacterial community for health and preventing diseases. Both also discuss how antibiotics disrupt the gut microbial community. From Science Daily:

Microbes help to battle infection: Gut microbes help develop immune cells, study finds

The human relationship with microbial life is complicated. Although there are types of bacteria that can make us sick, Caltech professor of biology and biological engineering Sarkis Mazmanian and his team are most interested in the thousands of other bacteria -- many already living inside our bodies -- that actually keep us healthy. Now, he and his team have found that these good bugs might also prepare the immune cells in our blood to fight infections from harmful bacteria.

In the recent study, published on March 12 in the journal Cell Host & Microbe, the researchers found that beneficial gut bacteria were necessary for the development of innate immune cells -- specialized types of white blood cells that serve as the body's first line of defense against invading pathogens.

In addition to circulating in the blood, reserve stores of immune cells are also kept in the spleen and in the bone marrow. When the researchers looked at the immune cell populations in these areas in so-called germ-free mice, born without gut bacteria, and in healthy mice with a normal population of microbes in the gut, they found that germ-free mice had fewer immune cells -- specifically macrophages, monocytes, and neutrophils -- than healthy mice. Germ-free mice also had fewer granulocyte and monocyte progenitor cells, stemlike cells that can eventually differentiate into a few types of mature immune cells.

Khosravi, one of the study's authors, and his colleagues next wanted to see if the reduction in immune cells in the blood would make the germ-free mice less able to fight off an infection by the harmful bacterium Listeria monocytogenes -- a well-studied human pathogen often used to study immune responses in mice. While the healthy mice were able to bounce back after being injected with Listeria, the infection was fatal to germ-free mice. When gut microbes that would normally be present were introduced into germ-free mice, the immune cell population increased and the mice were able to survive the Listeria infection.

The researchers also gave injections of Listeria to healthy mice after those mice were dosed with broad-spectrum antibiotics that killed off both harmful and beneficial bacteria. Interestingly, these mice also had trouble fighting the Listeria infection. "We didn't look at clinical data in this study, but we hypothesize that this might also happen in the clinic," says Mazmanian. "For example, when patients are put on antibiotics for something like hip surgery, are you damaging their gut microbe population and making them more susceptible to an infection that had nothing to do with their hip surgery?"

More importantly, the research also suggests that a healthy population of gut microbes can actually provide a preventative alternative to antibiotics, Khosravi says. 

From Science Daily:

Large study identifies exact gut bacteria involved in Crohn's disease

While the causes of Crohn's disease are not well understood, recent research indicates an important role for an abnormal immune response to the microbes that live in the gut. In the largest study of its kind, researchers have now identified specific bacteria that are abnormally increased or decreased when Crohn's disease develops. The findings, which appear in the March 12 issue of the Cell Press journal Cell Host & Microbe, suggest which microbial metabolites could be targeted to treat patients with this chronic and currently incurable inflammatory bowel disease.

Twenty-eight gastroenterology centers across North America have been working together to uncover how microbes contribute to the inflammatory cascade of Crohn's disease. Researchers took biopsies from 447 individuals with new-onset Crohn's disease and 221 nonaffected individuals at multiple locations along the gastrointestinal tract and then looked for differences between the two groups. They also validated their methods in additional individuals, resulting in a total of 1,742 samples from pediatric and adult patients with either new-onset or established disease.

The team found that microbial balance was disrupted in patients with Crohn's disease, with beneficial microbes missing and pathological ones flourishing. Having more of the disease-associated organisms correlated with increasing clinical disease activity. 

When the researchers analyzed the effects of antibiotics, which are sometimes used to treat Crohn's disease symptoms prior to diagnosis, they found that antibiotic usage in children with Crohn's disease could be counterproductive because it causes a loss of good microbes and an increase in pathological ones.

Head lice are a big, big concern for parents of young school-age children. From Science Daily:

Ordinary conditioner removes head lice eggs as effectively as special products

Eggs from head lice, also called nits, are incredibly difficult to remove. Female lice lay eggs directly onto strands of hair, and they cement them in place with a glue-like substance, making them hard to get rid of. In fact, the eggs are glued down so strongly that they will stay in place even after hair has been treated with pediculicides -- substances used to kill lice.

Some shampoos and conditioners that contain chemicals or special oils are marketed as nit-removal products. However, new research just published in the Journal of Medical Entomology shows that ordinary hair conditioner is just as effective.

In an article called "Efficacy of Products to Remove Eggs of Pediculus humanus capitis (Phthiraptera: Pediculidae) From the Human Hair," scientists from Belgium gathered 605 hairs from six different children. Each hair had a single nit attached to it. Approximately 14% of the eggshells contained a dead egg, whereas the rest were empty.

They then tried to remove the eggs and tested the amount of force needed to do so. They found that nits on the hairs that were left completely untreated were the most difficult to remove. Eggs on hairs that had been soaked in deionized water were much easier to remove, as were the eggs on hairs that had been treated with ordinary hair conditioner and with products specifically marketed for the purpose of nit removal.

However, they found no significant differences between the ordinary conditioners and the special nit-removal products. In all cases, less force was required to remove the nits after the hair had been treated, but the effectiveness of the products was essentially the same.

The authors hypothesize that the deionized water was effective because it acts as a lubricant, so less friction is needed to remove the nits from the hairs. The same goes for the conditioners.

From Science Daily:

Dropped your toast? Five-second food rule exists, new research suggests

Food picked up just a few seconds after being dropped is less likely to contain bacteria than if it is left for longer periods of time, according to the findings of research carried out at Aston University's School of Life and Health Sciences.

The findings suggest there may be some scientific basis to the '5 second rule' -- the urban myth about it being fine to eat food that has only had contact with the floor for five seconds or less. Although people have long followed the 5 second rule, until now it was unclear whether it actually helped.

The study, undertaken by final year Biology students and led by Anthony Hilton, Professor of Microbiology at Aston University, monitored the transfer of the common bacteria Escherichia coli (E. coli) and Staphylococcus aureus from a variety of indoor floor types (carpet, laminate and tiled surfaces) to toast, pasta, biscuit and a sticky sweet when contact was made from 3 to 30 seconds.

The results showed that:
- Time is a significant factor in the transfer of bacteria from a floor surface to a piece of food; and
- The type of flooring the food has been dropped on has an effect, with bacteria least likely to transfer from carpeted surfaces and most likely to transfer from laminate or tiled surfaces to moist foods making contact for more than 5 seconds.

Professor Hilton said: "Consuming food dropped on the floor still carries an infection risk as it very much depends on which bacteria are present on the floor at the time; however the findings of this study will bring some light relief to those who have been employing the five-second rule for years, despite a general consensus that it is purely a myth. We have found evidence that transfer from indoor flooring surfaces is incredibly poor with carpet actually posing the lowest risk of bacterial transfer onto dropped food."

Two studies about Vitamin D. From Science Daily:

Vitamin D increases breast cancer patient survival, study shows

Breast cancer patients with high levels of vitamin D in their blood are twice as likely to survive the disease as women with low levels of this nutrient, report University of California, San Diego School of Medicine researchers in the March issue of Anticancer Research.

In previous studies, Cedric F. Garland, DrPH, professor in the Department of Family and Preventive Medicine, showed that low vitamin D levels were linked to a high risk of premenopausal breast cancer. That finding, he said, prompted him to question the relationship between 25-hydroxyvitamin D -- a metabolite produced by the body from the ingestion of vitamin D -- and breast cancer survival rates.

Garland and colleagues performed a statistical analysis of five studies of 25-hydroxyvitamin D obtained at the time of patient diagnosis and their follow-up for an average of nine years. Combined, the studies included 4,443 breast cancer patients.

Women in the high serum group had an average level of 30 nanograms per milliliter (ng/ml) of 25-hydroxyvitamin D in their blood. The low group averaged 17 ng/ml. The average level in patients with breast cancer in the United States is 17 ng/ml.

A 2011 meta-analysis by Garland and colleagues estimated that a serum level of 50 ng/ml is associated with 50 percent lower risk of breast cancer. While there are some variations in absorption, those who consume 4,000 International Units (IU) per day of vitamin D from food or a supplement normally would reach a serum level of 50 ng/ml. 

From Science Daily:

Vitamin D deficiency may compromise immune function

Older individuals who are vitamin D deficient also tend to have compromised immune function, according to new research accepted for publication in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism (JCEM).

"Our data suggest vitamin D may be involved in maintaining the health of the immune system as well as the skeletal system," said one of the study's authors, Mary Ward, PhD, of the University of Ulster in Coleraine, U.K.

The observational study of 957 Irish adults who were at least 60 years old examined vitamin D levels as well as biomarkers of inflammation. Participants who were vitamin D deficient were more likely to have high levels of these biomarkers, which are linked to cardiovascular disease and inflammatory conditions such as multiple sclerosis and rheumatoid arthritis.

From Science Daily:

Meat and cheese may be as bad for you as smoking

That chicken wing you're eating could be as deadly as a cigarette. In a new study that tracked a large sample of adults for nearly two decades, researchers have found that eating a diet rich in animal proteins during middle age makes you four times more likely to die of cancer than someone with a low-protein diet -- a mortality risk factor comparable to smoking.

"There's a misconception that because we all eat, understanding nutrition is simple. But the question is not whether a certain diet allows you to do well for three days, but can it help you survive to be 100?" said corresponding author Valter Longo, the Edna M. Jones Professor of Biogerontology at the USC Davis School of Gerontology and director of the USC Longevity Institute.

Not only is excessive protein consumption linked to a dramatic rise in cancer mortality, but middle-aged people who eat lots of proteins from animal sources -- including meat, milk and cheese -- are also more susceptible to early death in general, reveals the study to be published March 4 in Cell Metabolism. Protein-lovers were 74 percent more likely to die of any cause within the study period than their low-protein counterparts. They were also several times more likely to die of diabetes.

But how much protein we should eat has long been a controversial topic -- muddled by the popularity of protein-heavy diets such as Paleo and Atkins. Before this study, researchers had never shown a definitive correlation between high protein consumption and mortality risk.

Rather than look at adulthood as one monolithic phase of life, as other researchers have done, the latest study considers how biology changes as we age, and how decisions in middle life may play out across the human lifespan.

In other words, what's good for you at one age may be damaging at another. Protein controls the growth hormone IGF-I, which helps our bodies grow but has been linked to cancer susceptibility. Levels of IGF-I drop off dramatically after age 65, leading to potential frailty and muscle loss. The study shows that while high protein intake during middle age is very harmful, it is protective for older adults: those over 65 who ate a moderate- or high-protein diet were less susceptible to disease.

"The research shows that a low-protein diet in middle age is useful for preventing cancer and overall mortality, through a process that involves regulating IGF-I and possibly insulin levels," said co-author Eileen Crimmins, the AARP Chair in Gerontology at USC. "However, we also propose that at older ages, it may be important to avoid a low-protein diet to allow the maintenance of healthy weight and protection from frailty."

Crucially, the researchers found that plant-based proteins, such as those from beans, did not seem to have the same mortality effects as animal proteins. Rates of cancer and death also did not seem to be affected by controlling for carbohydrate or fat consumption, suggesting that animal protein is the main culprit.

People who ate a moderate amount of protein were still three times more likely to die of cancer than those who ate a low-protein diet in middle age, the study shows. Overall, even the small change of decreasing protein intake from moderate levels to low levels reduced likelihood of early death by 21 percent.


From Medscape:

Allergic Rhinitis Patients Live Longer

Their runny noses might drive them crazy, but people with allergic rhinitis are likely to outlive the rest of us, a new study suggests.

"We found that allergic rhinitis patients had a decreased risk of heart attack, a decreased risk of stroke and, most strikingly, a decreased risk of all-cause mortality," said lead investigator Angelina Crans Yoon, MD, from the Department of Allergy and Clinical Immunology at the Kaiser Permanente Los Angeles Medical Center.

"They were basically half as likely to die during the study period," she told Medscape Medical News. 

Researchers studying data from the National Health and Nutrition Examination Survey (NHANES) found that people who tested positive for allergies were less likely to suffer cardiovascular events.

To explore the issue further, Dr. Crans Yoon and her team looked at a database of Southern California patients. The cohort consisted of 109,229 patients with allergic rhinitis and 109,229 people without allergic rhinitis who were matched for age, sex, and ethnicity. It also consisted of 92,775 patients with asthma who were matched with a similar group without asthma.

Risk for acute myocardial infarction was 25% lower in patients with allergic rhinitis than in those without, risk for a cerebrovascular event was 19% lower, and risk for all-cause mortality was 49% lower. Risk for all cardiovascular events was similar in the allergic rhinitis and control groups.
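As a side note for readers unfamiliar with how these figures are derived: a "25% lower risk" is a relative comparison between the two matched cohorts. The sketch below uses hypothetical event counts (the article reports only the relative risks, not raw numbers) to show how such a percentage falls out of the arithmetic.

```python
# Illustrative only: hypothetical event counts, chosen so the relative
# reduction comes out to the 25% figure reported for heart attack.
# The cohort size (109,229 per matched group) is from the article.
n = 109_229
events_control = 400    # hypothetical MI events, no allergic rhinitis
events_rhinitis = 300   # hypothetical MI events, allergic rhinitis

risk_control = events_control / n
risk_rhinitis = events_rhinitis / n
relative_reduction = 1 - risk_rhinitis / risk_control
print(f"{relative_reduction:.0%}")  # prints 25%
```

The same calculation, with different hypothetical counts, would yield the 19% and 49% reductions quoted for cerebrovascular events and all-cause mortality.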

In contrast, risk for all cardiovascular events was 36% higher in patients with asthma than in those without, whereas risk for cerebrovascular disease and all-cause mortality were similar.

This could be the result of a difference in phenotypes in asthma patients, said Dr. Crans Yoon. People whose asthma is caused by allergies could be at less risk for cardiovascular events than people whose asthma has other causes.

Why should allergic rhinitis decrease someone's risk for death? 

One possible explanation is that the immune systems of patients with allergic rhinitis are hyperalert, aggressively fighting off disease even when it is not necessary, as well as causing symptoms. More work is needed to evaluate that.

Keep your eye on future nanosilver research. From Science Daily:

More dangerous chemicals in everyday life: Now experts warn against nanosilver

Endocrine disrupters are not the only worrying chemicals that ordinary consumers are exposed to in everyday life. Nanoparticles of silver, found in products such as dietary supplements, cosmetics, and food packaging, now worry scientists too. A new study from the University of Southern Denmark shows that nano-silver can penetrate our cells and cause damage.

Silver has an antibacterial effect, and the food and cosmetic industries therefore often coat their products with silver nanoparticles. Nano-silver can be found in drinking bottles, cosmetics, band-aids, toothbrushes, running socks, refrigerators, washing machines, and food packaging, among other products.

"Silver as a metal does not pose any danger, but when you break it down to nano-sizes, the particles become small enough to penetrate a cell wall. If nano-silver enters a human cell, it can cause changes in the cell," explain Associate Professor Frank Kjeldsen and Thiago Verano-Braga, Ph.D., of the Department of Biochemistry and Molecular Biology at the University of Southern Denmark.

The researchers examined human intestinal cells, as they consider these to be most likely to come into contact with nano-silver, ingested with food.

"We can confirm that nano-silver leads to the formation of harmful, so-called free radicals in cells. We can also see that there are changes in the form and amount of proteins. This worries us," say Frank Kjeldsen and Thiago Verano-Braga.

A large number of serious diseases are characterized by the fact that there is an overproduction of free radicals in cells. This applies to cancer and neurological diseases such as Alzheimer's and Parkinson's.

Kjeldsen and Verano-Braga emphasize that their research was conducted on human cells in a laboratory, not on living people. They also point out that they do not know how large a dose of nano-silver a person must be exposed to before cellular changes emerge.

"We don't know how much is needed, so we cannot conclude that nano-silver can make you sick. But we can say that we must be very cautious and worried when we see an overproduction of free radicals in human cells," they say.

Nano-silver is also sold as a dietary supplement, with claims of antibacterial, anti-flu, and cancer-inhibiting effects. It is also said to help against low blood counts and bad skin. In the EU, marketing dietary supplements and foods with claims of medical effects is not allowed, but nano-silver is easy to find and buy online.

In the wake of the University of Southern Denmark research, the Danish Veterinary and Food Administration now warns against taking dietary supplements with nano-silver.

Please note that 70 grams is about 2 1/2 ounces of chocolate. From Science Daily:
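For anyone who wants to check the conversion in that note, here is the arithmetic (using the standard 28.3495 grams per ounce):

```python
# Sanity check of the grams-to-ounces note for the daily chocolate dose.
GRAMS_PER_OUNCE = 28.3495  # standard avoirdupois conversion factor

ounces = 70 / GRAMS_PER_OUNCE
print(round(ounces, 2))  # prints 2.47, i.e. roughly 2 1/2 ounces
```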

Why dark chocolate is good for your heart

It might seem too good to be true, but dark chocolate is good for you and scientists now know why. Dark chocolate helps restore flexibility to arteries while also preventing white blood cells from sticking to the walls of blood vessels. Both arterial stiffness and white blood cell adhesion are known factors that play a significant role in atherosclerosis. What's more, the scientists also found that increasing the flavanol content of dark chocolate did not change this effect. This discovery was published in the March 2014 issue of The FASEB Journal.

"We provide a more complete picture of the impact of chocolate consumption in vascular health and show that increasing flavanol content has no added beneficial effect on vascular health," said Diederik Esser, Ph.D., a researcher involved in the work from the Top Institute Food and Nutrition and Wageningen University, Division of Human Nutrition in Wageningen, The Netherlands. "However, this increased flavanol content clearly affected taste and thereby the motivation to eat these chocolates. So the dark side of chocolate is a healthy one."

To make this discovery, Esser and colleagues analyzed 44 middle-aged overweight men over two periods of four weeks as they consumed 70 grams of chocolate per day. Study participants received either specially produced dark chocolate with high flavanol content or chocolate that was regularly produced. Both chocolates had a similar cocoa mass content. Before and after both intervention periods, researchers performed a variety of measurements that are important indicators of vascular health. During the study, participants were advised to refrain from certain energy dense food products to prevent weight gain. Scientists also evaluated the sensory properties of the high flavanol chocolate and the regular chocolate and collected the motivation scores of the participants to eat these chocolates during the intervention.

"The effect that dark chocolate has on our bodies is encouraging not only because it allows us to indulge with less guilt, but also because it could lead the way to therapies that do the same thing as dark chocolate but with better and more consistent results," said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. 

From NPR:

More Hints That Dad's Age At Conception Helps Shape A Child's Brain

Traditionally, research has focused on women's "biological clock." But in recent years, scientists have been looking more and more at how the father's age at conception might affect the baby, too. A study published Wednesday hints that age really might matter — in terms of the child's mental health.

Researchers from Indiana University and the Karolinska Institute found that compared with children fathered by men who were 20-24 years old, kids born to dads who were 45 or older were three times as likely to have autism and 13 times as likely to have ADHD. Kids born to older dads were also more likely to go on to develop substance abuse problems and get lower grades in school. The findings appear in JAMA Psychiatry.

To figure out how paternal age was related to children's psychiatric health, the researchers looked at millions of parents in Sweden who had children between 1973 and 2001. The researchers took into account the mother's age, as well as other demographic factors that might play a role in the child's cognitive development and mental health.

"There's a growing body of literature that suggests that advancing paternal age is associated with a host of problems," lead researcher Brian D'Onofrio tells Shots. Another study, published in JAMA Psychiatry last month, found that the children of older fathers seemed to be at greater risk for developing schizophrenia and autism.

D'Onofrio and his colleagues paid special attention to siblings and cousins, and found that even among kids in the same extended family, a dad's age when his child was born made a difference.

The results are in line with a growing body of research linking older fatherhood with various developmental problems in children.

However, the study looks only at how paternal age and children's mental health are associated — it's a correlation, Reichenberg cautions, not a proven causal link. Scientists haven't yet determined the mechanisms of the effect. But it doesn't seem to be simply a matter of overdiagnosis among the children of older parents, the scientists say. Other research has found that as men get older, their sperm cells are more likely to contain random mutations that might, theoretically, contribute to disorders like autism in their kids.

Ultimately, men and women of all ages, he says, should remember that age is only one of many factors influencing the developing baby's health.

"The most important thing is [that] future mothers and fathers should still go ahead and have children, even if the father is older than 45 or 50," Reichenberg says. "Most of these children will be absolutely fine."