
A new study conducted in China found an association between low vitamin D levels and future cognitive decline in older adults. The lower the vitamin D level at the initial screening (the baseline), the more people showed cognitive decline at a 2-year follow-up. There were no gender differences. (Another study found similar results.) Vitamin D is produced naturally in the skin when it is exposed to sunlight, and is also found in smaller amounts in foods such as fish (e.g., salmon) and eggs. Vitamin D helps maintain healthy bones and muscles, but it also plays a key part in brain function and is viewed as neuroprotective. Low levels are associated with greater risk of cardiovascular and neurodegenerative diseases.

The 1,202 participants (60 years or older) in China had their baseline vitamin D levels measured at the start of the study, and their cognitive abilities assessed over two years. What I found interesting in this study was that the vitamin D levels were in general pretty low - this was without any supplementation, thus from sunlight and food. The researchers specified vitamin D levels (25-hydroxycholecalciferol, i.e., 25(OH)D3) in nmol/l, but in the United States values are generally specified in ng/ml (a quick conversion sketch follows the excerpt below). In the study the median vitamin D level in the lowest quartile, converted to ng/ml, was 10.0 ng/ml, and in the highest quartile the median was 26.4 ng/ml. With numbers that low, all four quartile groups in the United States would be advised to supplement daily with vitamin D (specifically vitamin D3). From Journals of Gerontology: Medical Sciences:

Vitamin D Levels and the Risk of Cognitive Decline in Chinese Elderly People: the Chinese Longitudinal Healthy Longevity Survey

Vitamin D has a neuroprotective function, potentially important for the prevention of cognitive decline. Prospective studies from Western countries support an association between lower vitamin D level and future cognitive decline in elderly people.

This community-based cohort study of elderly people in China follows 1,202 cognitively intact adults aged ≥60 years for a mean duration of 2 years. Plasma vitamin D level was measured at the baseline. Cognitive state of participants was assessed using the Mini-Mental State Examination (MMSE). Cognitive impairment was defined as an MMSE score <18. Cognitive decline was defined as ≥3 points decline from baseline....Participants with low vitamin D level had an increased risk of cognitive decline. This first follow-up study of elderly people, including the oldest-old, in Asia shows that low vitamin D levels were associated with increased risk of subsequent cognitive decline and impairment.

Vitamin D is a secosteroid hormone necessary for maintaining good musculoskeletal health; its deficiency is associated with increased risks of cardiovascular and neurodegenerative diseases. Vitamin D is primarily synthesized in the skin upon exposure to sunlight; smaller amounts are obtained through dietary intake. More recently, enzymes responsible for the synthesis of its active form have been found to be distributed throughout the human brain.... This growing body of evidence suggests that vitamin D has a neuroprotective function that is potentially important for the prevention of cognitive decline. Although the importance of vitamin D cannot be disregarded, there is still no consensus on its optimal level. This is especially pertinent in the elderly people, the oldest-old in particular, as cutaneous synthesis of vitamin D decreases with age. Moreover, their impaired mobility and limited outdoor activities can further exacerbate vitamin D deficiency.

Cross-sectional studies have generally found a positive association between vitamin D status and cognitive performance in older adults. Recent prospective studies from the United States and Europe support an association between diminished vitamin D status and future cognitive decline. Since cutaneous synthesis is the main source of vitamin D, there exists great variability in vitamin D levels across populations due to differences in latitude, seasons, and race/ethnicity, such as level of skin pigmentation.

Our findings were consistent with previous cohort studies showing that vitamin D status predicts cognitive decline....A notable observation in the present study is that the association of vitamin D status and cognitive decline was similar in both the oldest-old and less elderly people. In this study, there was a clear association between lower 25(OH)D3 level and cognitive impairment in subjects aged ≥80....An additional difference from previous studies is that the current study indicates that the association between vitamin D and cognitive impairment is not gender specific.

The observation of a temporal association between 25(OH)D3 levels and subsequent cognitive function supports the notion that vitamin D has a clinically important neuroprotective effect. A wide variety of mechanisms for this effect has been proposed and is supported by animal studies. Vitamin D has been found to modulate neuronal calcium homeostasis, cerebral processes of detoxification, immunomodulation, and beta-amyloid clearance....Further, it was unlikely that vitamin D supplementation would explain the association in this study, as 87% of the participants reported no use of vitamin supplements....In conclusion, our longitudinal study indicates that low 25(OH)D3 levels are associated with subsequent cognitive decline and cognitive impairment.
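For US readers, here is a minimal sketch of the unit conversion mentioned earlier, using the standard factor for 25(OH)D of 1 ng/ml = 2.496 nmol/l. The 10.0 and 26.4 ng/ml quartile medians come from the study as quoted above; the code itself is just illustrative arithmetic.

```python
# Convert 25(OH)D3 levels between the study's nmol/L and the ng/mL units
# commonly used in the United States (1 ng/mL = 2.496 nmol/L).

NMOL_PER_NGML = 2.496

def nmol_to_ngml(nmol_per_l: float) -> float:
    """Convert a 25(OH)D3 level from nmol/L to ng/mL."""
    return nmol_per_l / NMOL_PER_NGML

def ngml_to_nmol(ng_per_ml: float) -> float:
    """Convert a 25(OH)D3 level from ng/mL to nmol/L."""
    return ng_per_ml * NMOL_PER_NGML

# Quartile medians reported in the study, back-converted from the ng/mL
# values quoted above (10.0 and 26.4 ng/mL -> ~25 and ~66 nmol/L).
for label, ngml in [("lowest quartile", 10.0), ("highest quartile", 26.4)]:
    print(f"{label}: {ngml_to_nmol(ngml):.0f} nmol/L = {ngml:.1f} ng/mL")
```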

A lot of research has shown benefits to being bilingual (here, here, and here). Now research suggests that knowing even more languages (multilingualism) may be even better for the brain's flexibility or "neural plasticity". From Science Daily:

EEG recordings prove learning foreign languages can sharpen our minds

Scientists from the Higher School of Economics (HSE) together with colleagues from the University of Helsinki have discovered that learning foreign languages enhances our brain's elasticity and its ability to code information. The more foreign languages we learn, the more effectively our brain reacts and processes the data accumulated in the course of learning.

Researchers carried out experiments where the brain's electrical activity was measured with EEG (electroencephalography). Twenty-two students in total (10 male and 12 female) participated in the investigation, with the average age being 24. The subjects had electrodes placed on their heads and then listened to recordings of different words in their native language, as well as in foreign languages, both known and completely unknown to the subjects. When the known or unknown words popped up, changes in the brain's activity were tracked.... Apparently, the ability of the brain to quickly process information depends on one's "linguistic anamneses."

The experiment showed that the brain's electrical activity was higher in those participants who already knew some foreign languages. The author of the study, Yuriy Shtyrov, commented that the more languages someone had mastered, the faster the neural network coding the information on the new words was formed. Consequently, this new data stimulates the brain's physiology: loading the mind with more knowledge boosts its elasticity.

Yikes! Another study showing effects from antibiotic use - this time a higher incidence of food allergies in children who took antibiotics in the first year of life. The association was especially strong with multiple courses of antibiotics, and strongest among children receiving cephalosporin and sulfonamide antibiotics. Antibiotics can be life-saving, but there can also be unintended consequences.

As the researchers wrote: "Changes in the composition, richness, and abundance of microbiota that colonize the human gut during infancy have been theorized to play a role in the development of atopic disease, including food allergen sensitization." And what changes the gut microbes? Antibiotics. Other research suggests that alterations in microbes due to childhood antibiotic use may increase the risk of Crohn's disease, obesity, and asthma. From Science Daily:

Young children's antibiotic exposure associated with higher food allergy risk

Antibiotic treatment within the first year of life may wipe out more than an unwanted infection: exposure to the drugs is associated with an increase in food allergy diagnosis, new research from the University of South Carolina suggests.

Analyzing South Carolina Medicaid administrative data from 2007 to 2009, researchers from the College of Pharmacy, School of Medicine and Arnold School of Public Health identified 1,504 cases of children with food allergies and 5,995 controls without food allergies, adjusting for birth month and year, sex and race/ethnicity. Applying conditional logistic regression and adjusting for factors including birth, breastfeeding, asthma, eczema, maternal age and urban residence, the researchers found that children prescribed antibiotics within the first year of life were 1.21 times more likely to be diagnosed with food allergy than children who hadn't received an antibiotic prescription.

The association between antibiotic prescription and development of food allergy was statistically significant, and the odds of a food allergy diagnosis increased with the number of antibiotic prescriptions a child received, growing from 1.31 times greater risk with three prescriptions to 1.43 times with four prescriptions and 1.64 times with five or more prescriptions. The interdisciplinary research team, led by Bryan Love, Pharm.D., found the strongest association in children who were prescribed cephalosporin and sulfonamide antibiotics, which are broad-spectrum therapies (adjusted OR 1.50 and 1.54, respectively), compared with narrower spectrum agents such as penicillins and macrolides.

This research builds upon previous studies finding that normal gut flora is critical for developing the body's tolerance to foreign proteins such as food. Antibiotics are known to alter the composition of gut flora, and U.S. children ages three months to three years are prescribed 2.2 antimicrobial prescriptions per year on average, according to the literature. The study's results suggest a potential link between the rise in antibiotic prescriptions for young children and the rise in diagnosis of food allergies in children.
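The 1.21 figure above is an adjusted odds ratio from conditional logistic regression, which can't be recomputed from the summary alone. As a minimal sketch of what an (unadjusted) odds ratio measures, here is the standard 2x2 calculation; the exposure split below is invented for illustration, and only the 1,504-case / 5,995-control totals come from the study.

```python
# Unadjusted odds ratio from a case-control 2x2 table. The study itself
# used conditional logistic regression with covariate adjustment; this
# sketch only illustrates the basic quantity being estimated.

def odds_ratio(exposed_cases: int, unexposed_cases: int,
               exposed_controls: int, unexposed_controls: int) -> float:
    """(a/b) / (c/d) for the standard 2x2 exposure table."""
    return (exposed_cases / unexposed_cases) / (exposed_controls / unexposed_controls)

# Hypothetical split of the 1,504 cases and 5,995 controls by first-year
# antibiotic exposure (counts invented for illustration only).
a, b = 800, 704       # cases: exposed, unexposed (sums to 1,504)
c, d = 2900, 3095     # controls: exposed, unexposed (sums to 5,995)
print(f"OR = {odds_ratio(a, b, c, d):.2f}")  # ~1.21 with these made-up counts
```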

A recent study of microbiomes (microbial communities) of patients admitted to intensive care units (ICU) found that they had a rapid loss of normal, "health promoting" bacteria, which resulted in the "overgrowth of disease-promoting pathogenic bacteria (dysbiosis), which, in turn, makes patients susceptible to hospital-acquired infections, sepsis, and organ failure". In other words, serious illnesses disrupt human microbial communities, as do treatments, medicines, antibiotics, and lack of proper nutrition in intensive care units. Interestingly, they observed "large depletions of organisms previously thought to confer anti-inflammatory benefits, such as Faecalibacterium". Faecalibacterium prausnitzii has been discussed in other posts as an incredibly important beneficial bacterium for health, a keystone species in the gut (here and here).

The researchers, who took skin, oral, and fecal samples at two time points, expressed surprise over how rapidly the microbial communities changed, and suggested that possible treatments for microbial communities that are out-of-whack (dysbiosis) are "probiotics or targeted, multimicrobe synthetic “stool pills” that restore a healthy microbiome in the ICU setting to improve patient outcomes." In other words, "restoration of a healthy gut microbiome may be important for improving outcomes in critically ill patients". Of course.... From Science Daily:

ICU patients lose helpful gut bacteria within days of hospital admission

The microbiome of patients admitted to the intensive care unit (ICU) at a hospital differs dramatically from that of healthy patients, according to a new study published in mSphere. Researchers analyzing microbial taxa in ICU patients' guts, mouths and skin reported finding dysbiosis, or a bacterial imbalance, that worsened during a patient's stay in the hospital. Compared to healthy people, ICU patients had depleted populations of commensal, health-promoting microbes and higher counts of bacterial taxa with pathogenic strains -- leaving patients vulnerable to hospital-acquired infections that may lead to sepsis, organ failure and potentially death.

What makes a gut microbiome healthy or not remains poorly defined in the field. Nonetheless, researchers suspect that critical illness requiring a stay in the ICU is associated with the loss of bacteria that help keep a person healthy. The new study, which prospectively monitored and tracked changes in bacterial makeup, delivers evidence for that hypothesis. "The results were what we feared them to be," says study leader Paul Wischmeyer, an anesthesiologist at the University of Colorado School of Medicine. "We saw a massive depletion of normal, health-promoting species."

Wischmeyer, who will move to Duke University in the fall, runs a lab that focuses on nutrition-related interventions to improve outcomes for critically ill patients. He notes that treatments used in the ICU -- including courses of powerful antibiotics, medicines to sustain blood pressure, and lack of nutrition -- can reduce the population of known healthy bacteria. An understanding of how those changes affect patient outcomes could guide the development of targeted interventions to restore bacterial balance, which in turn could reduce the risk of infection by dangerous pathogens.

Previous studies have tracked microbiome changes in individual or small numbers of critically ill patients, but Wischmeyer and his collaborators analyzed skin, stool, and oral samples from 115 ICU patients across four hospitals in the United States and Canada. They analyzed bacterial populations in the samples twice -- once 48 hours after admission, and again after 10 days in the ICU (or when the patient was discharged). They also recorded what the patients ate, what treatments patients received, and what infections patients incurred.

The researchers compared their data to data collected from a healthy subset of people who participated in the American Gut project dataset. (American Gut is a crowd-sourced project aimed at characterizing the human microbiome by the Rob Knight Lab at the University of California San Diego.) They reported that samples from ICU patients showed lower levels of Firmicutes and Bacteroidetes bacteria, two of the largest groups of microbes in the gut, and higher abundances of Proteobacteria, which include many pathogens.

Wischmeyer was surprised by how quickly the microbiome changed in the patients. "We saw the rapid rise of organisms clearly associated with disease," he says. "In some cases, those organisms became 95 percent of the entire gut flora -- all made up of one pathogenic taxa -- within days of admission to the ICU. That was really striking." Notably, the researchers reported that some of the patient microbiomes, even at the time of admission, resembled the microbiomes of corpses. "That happened in more people than we would like to have seen," he says.....In addition, now that researchers have begun to understand how the microbiome changes in the ICU, Wischmeyer says the next step is to use the data to identify therapies -- perhaps including probiotics -- to restore a healthy bacterial balance to patients.

It is important to eat a varied diet for health, one that focuses on the food groups (and no - cookies and cake are not necessary foods). The first study looks at liver cancer risk and selenium - which is found in fish, shellfish, meat, milk, eggs, and certain South American nuts, such as Brazil nuts. The second article focuses on colorectal cancer and retinoic acid, a compound derived in the body from vitamin A. Vitamin A-rich foods, such as the lungs, kidneys, and liver of beef, lamb, and pork, as well as poultry giblets, eggs, cod liver oil, shrimp, fish, fortified milk, butter, cheddar cheese, and Swiss cheese, can provide you with retinoic acid. Red and orange vegetables and fruits such as sweet potatoes, squash, carrots, pumpkins, cantaloupes, apricots, peaches, and mangoes all contain significant amounts of beta-carotene, and thus retinoids. Note that research generally has found health benefits from real foods, not from supplements.

From Science Daily: Selenium status influences cancer risk

As a nutritional trace element, selenium forms an essential part of our diet. Researchers have been able to show that high blood selenium levels are associated with a decreased risk of developing liver cancer. Selenium (Se) is found in foods like fish, shellfish, meat, milk and eggs; certain South American nuts, such as Brazil nuts, are also good sources of selenium. It is a trace element that occurs naturally in soil and plants, and enters the bodies of humans and animals via the food they ingest. European soil has a rather low selenium concentration in comparison with other areas of the world, especially North America. Deficiencies of varying degrees of severity are common among the general population, and are the reason why German livestock receive selenium supplements in their feed.

While in Europe, neither a selenium-rich diet nor adequate selenium supplementation is associated with adverse effects, selenium deficiency is identified as a risk factor for a range of diseases. "We have been able to show that selenium deficiency is a major risk factor for liver cancer," says Prof. Dr. Lutz Schomburg of the Institute of Experimental Endocrinology, adding: "According to our data, the third of the population with lowest selenium status have a five- to ten-fold increased risk of developing hepatocellular carcinoma -- also known as liver cancer."....Previous studies had suggested a similar relationship between a person's selenium status and their risk of developing colon cancer, as well as their risk of developing autoimmune thyroid disease. (Original study)

From Science Daily: Retinoic acid suppresses colorectal cancer development, study finds

Retinoic acid, a compound derived in the body from vitamin A, plays a critical role in suppressing colorectal cancer in mice and humans, according to researchers at the Stanford University School of Medicine. Mice with the cancer have lower-than-normal levels of the metabolite in their gut, the researchers found. Furthermore, colorectal cancer patients whose intestinal tissues express high levels of a protein that degrades retinoic acid tend to fare more poorly than their peers.

"The intestine is constantly bombarded by foreign organisms," said Edgar Engleman, MD, professor of pathology and of medicine. "As a result, its immune system is very complex. There's a clear link in humans between inflammatory bowel disease, including ulcerative colitis, and the eventual development of colorectal cancer. Retinoic acid has been known for years to be involved in suppressing inflammation in the intestine. We wanted to connect the dots and learn whether and how retinoic acid levels directly affect cancer development."

"We found that bacteria, or molecules produced by bacteria, can cause a massive inflammatory reaction in the gut that directly affects retinoic acid metabolism," said Engleman. "Normally retinoic acid levels are regulated extremely tightly. This discovery could have important implications for the treatment of human colorectal cancer."

Further investigation showed that retinoic acid blocks or slows cancer development by activating a type of immune cell called a CD8 T cell. These T cells then kill off the cancer cells. In mice, lower levels of retinoic acid led to reduced numbers and activation of CD8 T cells in the intestinal tissue and increased the animals' tumor burden, the researchers found. "It's become very clear through many studies that chronic, smoldering inflammation is a very important risk factor for many types of cancer," said Engleman.

What happens to your brain when you stop exercising? The results of this University of Maryland study should be a wake-up call for those who are not quite convinced of exercise's health benefits for the brain. The researchers examined cerebral blood flow in athletes (ages 50-80 years, recruited from running clubs) before and after a 10-day period during which they stopped all exercise. Using MRI brain imaging techniques, they found a significant decrease in blood flow to several brain regions important for cognitive health, including the hippocampus, after the athletes stopped their exercise routines.

As the researchers pointed out: "...the take home message is simple -- if you do stop exercising for 10 days, just as you will quickly lose your cardiovascular fitness, you will also experience a decrease in brain blood flow." The only good news was that there were no differences on cognitive measures before and after stopping exercise for 10 days. From Science Daily:

Use it or lose it: Stopping exercise decreases brain blood flow

We all know that we can quickly lose cardiovascular endurance if we stop exercising for a few weeks, but what impact does the cessation of exercise have on our brains? New research led by University of Maryland School of Public Health researchers examined cerebral blood flow in healthy, physically fit older adults (ages 50-80 years) before and after a 10-day period during which they stopped all exercise. Using MRI brain imaging techniques, they found a significant decrease in blood flow to several brain regions, including the hippocampus, after they stopped their exercise routines.

"We know that the hippocampus plays an important role in learning and memory and is one of the first brain regions to shrink in people with Alzheimer's disease," says Dr. J. Carson Smith, associate professor of kinesiology and lead author of the study, which is published in Frontiers in Aging Neuroscience in August 2016. "In rodents, the hippocampus responds to exercise training by increasing the growth of new blood vessels and new neurons, and in older people, exercise can help protect the hippocampus from shrinking. So, it is significant that people who stopped exercising for only 10 days showed a decrease in brain blood flow in brain regions that are important for maintaining brain health."

The study participants were all "master athletes," defined as people between the ages of 50 and 80 (average age was 61) who have at least 15 years' history of participating in endurance exercise and who have recently competed in an endurance event. Their exercise regimens must have entailed at least four hours of high intensity endurance training each week. On average, they were running 36 miles (59 km) each week or the equivalent of a 10K run a day! Not surprisingly, this group had a VO2 max (maximum volume of oxygen) above 90% for their age. This is a measure of the maximal rate of oxygen consumption of an individual and reflects their aerobic physical fitness.

Dr. Smith and colleagues measured the velocity of blood flow in the brain with an MRI scan while the athletes were still following their regular training routine (at peak fitness) and again after 10 days of no exercise. They found that resting cerebral blood flow significantly decreased in eight brain regions, including the areas of the left and right hippocampus and several regions known to be part of the brain's "default mode network" -- a neural network known to deteriorate quickly with a diagnosis of Alzheimer's disease. This information adds to the growing scientific understanding of the impact of physical activity on cognitive health.


Yes, of course this makes sense!.... Many rounds of antibiotics have an effect not just in one area of the body, but kill off both good and bad bacteria in many areas of the human body. The researchers in this study found that taking antibiotics for a reason OTHER THAN SINUSITIS was associated with an increased risk of developing chronic sinusitis (as compared to those people not receiving antibiotics).

Use of antibiotics more than doubles the odds of developing chronic sinusitis without nasal polyps. And this effect lasted for at least 2 years.

Other research has already associated antibiotic use with "decreased microbial diversity" in our microbiome and with "opportunistic infections" such as Candida albicans and Clostridium difficile. Diseases such as Crohn's disease and diabetes are also linked to antibiotic use. In other words, when there is a disturbance in the microbiome (e.g., from antibiotics) and the community of microbes becomes "out of whack", then pathogenic bacteria are "enriched" (increase) and can dominate.

This study lumped together chronic sinusitis without nasal polyps (CRSsNP) and chronic sinusitis with nasal polyps (CRSwNP), but when the 2 groups are separated out, then antibiotic use was mainly associated with chronic sinusitis without polyps. It appeared that antibiotic exposure did not significantly impact the odds of developing chronic sinusitis with nasal polyps.

The researchers write: "This effect was primarily driven by the CRSsNP subgroup, which also supports the evolving concept of CRSwNP (chronic sinusitis with nasal polyps) as a disease of primary inflammation rather than infection. Despite this, we elected to analyze the CRS (chronic rhinosinusitis) group as a whole because the precise relationship between CRS with and without nasal polyps remains incompletely understood, and it is possible that a proportion of the CRSsNP patients could go on to develop nasal polyps over time."

Which makes me wonder: will giving beneficial bacteria (such as Lactobacillus sakei) to those who have chronic sinusitis with nasal polyps show the same improvement in symptoms as in those without nasal polyps? Or do 2 treatments have to occur at once: something to lower the inflammation (which may be the reason for the nasal polyps) and also beneficial microbes to treat the bacterial imbalance of sinusitis? We just don't know yet. Note that CRS = chronic rhinosinusitis (commonly called chronic sinusitis). Research by A.Z. Maxfield et al. from The Laryngoscope:

General antibiotic exposure is associated with increased risk of developing chronic rhinosinusitis 

Antibiotic use and chronic rhinosinusitis (CRS) have been independently associated with microbiome diversity depletion and opportunistic infections. This study was undertaken to investigate whether antibiotic use may be an unrecognized risk factor for developing CRS. Case-control study of 1,162 patients referred to a tertiary sinus center for a range of sinonasal disorders.

Patients diagnosed with CRS according to established consensus criteria (n = 410) were assigned to the case group (273 without nasal polyps [CRSsNP], 137 with nasal polyps [CRSwNP]). Patients with all other diagnoses (n = 752) were assigned to the control group. Chronic rhinosinusitis disease severity was determined using a validated quality of life (QOL) instrument. The class, diagnosis, and timing of previous nonsinusitis-related antibiotic exposures were recorded.

Antibiotic use significantly increased the odds of developing CRSsNP as compared to nonusers. Antibiotic exposure was significantly associated with worse CRS QOL [quality of life] scores over at least the subsequent 2 years. These findings were confirmed by the administrative data review. Use of antibiotics more than doubles the odds of developing CRSsNP and is associated with a worse QOL for at least 2 years following exposure. These findings expose an unrecognized and concerning consequence of general antibiotic use.

Antibiotic use and chronic rhinosinusitis (CRS) have been independently associated with microbiome diversity depletion and opportunistic infections. This study was undertaken to investigate whether antibiotic use may be an unrecognized risk factor for developing CRS.....Antibiotics have also been associated with significant adverse side effects. It has long been recognized that antibiotic use may lead to increased susceptibility to secondary mucosal infections from pathogens including Candida albicans and Clostridium difficile.  Recent studies on the concept of mucosal microbial dysbiosis have suggested that these infections arise as a result of antibiotic induced depletion of the diverse commensal microbial assemblage, which enables the proliferation of pathogenic species.

Chronic rhinosinusitis (CRS) is defined....as having greater than 12 weeks of sinonasal symptoms, along with at least one objective measure of infection or inflammation by nasal endoscopy or radiographic imaging....However, the distinct lack of long-term disease resolution following antimicrobial therapy, and in some cases surgery, suggests that additional factors are likely involved. Through these studies, CRS with nasal polyps (CRSwNP) has been recognized as an inflammatory subtype characterized by eosinophilic inflammation and a T-helper cell type 2 immunologic profile. Although CRSwNP lacks the features of a classic infectious process, the precise role of bacteria and their byproducts in the promotion of nasal polyp-related inflammation remains unclear.

Recent findings from culture independent investigations of the sinonasal microbiome have offered new insights into the pathogenesis of CRS. These studies have suggested that a decreased microbial diversity exists in CRS patients as compared to healthy controls with a selective enrichment of pathogenic species. Furthermore, some studies have shown that antibiotic exposure may be a risk factor associated with this loss of biodiversity,  echoing the findings seen in postantibiotic C. difficile infections.  Although systemic antibiotics have long been a mainstay of therapy for CRS, these findings lead inexorably to the paradoxical hypothesis that antibiotic exposure may, in fact, promote its onset.

We performed a....case control study of 1,574 patients referred to the Massachusetts Eye and Ear Infirmary Sinus Center in 2014 with symptoms of presumed sinonasal disease.... Inclusion criteria included all antibiotic naive patients, and all antibiotic exposed patients for whom antibiotic use was for nonsinonasal-related infections. Among the antibiotic exposed group, only patients who used antibiotics for nonsinonasal-related infections prior to the onset of symptoms of CRS (within the case group) were enrolled in the study.....The case group was further substratified into CRS patients without nasal polyps (CRSsNP, n = 273) and with nasal polyps (CRSwNP, n = 137) based on the presence of nasal polyps on sinonasal endoscopy.

Among the case patients, 56.34% reported a previous nonsinus-related antibiotic exposure as compared to 42.02% of control patients. Antibiotic use significantly increased the odds of developing both CRSsNP and any form of CRS as compared to nonusers. This odds ratio was similar even when excluding patients who were treated for upper aerodigestive infections. In contrast, antibiotic exposure did not significantly impact the odds of developing CRSwNP. The percent of patients with any form of CRS and CRSsNP only, which was attributable to a previous exposure to antibiotics, was 24.69% and 33.70%, respectively. In both the case and control groups, the most common class of antibiotic patients received was a penicillin (52.63% vs. 45.77%), and the most common reported reason for antibiotic prescription was the diagnosis of pharyngitis (18.06% vs. 16.67%).

Among the CRS patients (i.e., case group), the use of antibiotics was significantly associated with worse QOL scores as compared to antibiotic-naïve CRS patients. The effect on QOL was enduring because patients who used antibiotics at least 2 years prior to the development of CRS (36.81%) had similar disease severity scores as compared to those with more recent exposures. There was no significant difference in QOL score among patients using different antibiotic classes and among patients with different underlying reasons for antibiotic use.

The human microbiome project has provided new insights into the distribution and abundance of bacterial species in both health and disease. Opportunistic pathogens, as defined by the Pathosystems Resource Integration Center, were found nearly ubiquitously in the nares of healthy subjects, albeit at relatively low abundance. Additional studies of the normal nasal cavity found an inverse correlation between the prevalence of Firmicutes such as S. aureus and benign commensal organisms, suggesting a homeostatic antagonism between potential pathogens and the remainder of the healthy microbial assemblage. Extrapolation of this concept would therefore predict that events resulting in a perturbation or loss of the commensal microbial community would enable proliferation of pathogenic species, resulting in the disease phenotype. This prediction has been borne out in several studies of the sinonasal microbiome in patients with CRS. Feazel et al. found a decreased number of bacterial types and an overabundance of S. aureus among CRS patients as compared to controls. Antibiotic exposure was one of the most significant clinical factors driving this effect. Similar findings were published by Choi et al. and Abreu et al.... Although literature regarding the sinonasal microbiome in health and disease remains nascent, it has provided some limited clues that antibiotics may lead to a reduction of sinonasal microbial biodiversity, which in turn may be a significant feature of CRS.

Our results demonstrate that exposure to antibiotics is a significant risk factor for the development of CRS and accounts for approximately 25% of the disease burden in our study population. These findings harmonize with the predictions of the nascent literature on the sinonasal microbiome. This effect was primarily driven by the CRSsNP subgroup, which also supports the evolving concept of CRSwNP as a disease of primary inflammation rather than infection. Despite this, we elected to analyze the CRS group as a whole because the precise relationship between CRS with and without nasal polyps remains incompletely understood, and it is possible that a proportion of the CRSsNP patients could go on to develop nasal polyps over time.....

One unexpected outcome of our study was that a large percentage of exposures preceded the onset of the diagnosis of sinusitis by more than 2 years. This indicates that, regardless of the mechanism, the sequelae of antibiotic use may endure much longer than previously thought....The impact of antibiotics on promoting bacterial resistance, and the development of mucosal infections from pathogens such as C. difficile and C. albicans, has been well established. This study demonstrates that antibiotics also significantly increase the risk of developing CRS, an effect that is driven primarily by CRS patients who do not have nasal polyps. Furthermore, premorbid antibiotic use could account for approximately 25% of our patients who developed CRS, and exposure conferred a worse disease-specific quality of life.
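As a sanity check on the "approximately 25%" figure, the attributable fraction can be reproduced from the exposure rates reported above (56.34% of cases vs. 42.02% of controls) with the standard case-control formulas; this is a back-of-the-envelope sketch, not the authors' calculation.

```python
# Reproduce the ~24.7% antibiotic-attributable fraction for CRS from the
# reported exposure rates: 56.34% of cases and 42.02% of controls had a
# prior non-sinusitis-related antibiotic exposure.

p_case, p_ctrl = 0.5634, 0.4202

# Unadjusted odds ratio computed from the two exposure proportions.
odds_ratio = (p_case / (1 - p_case)) / (p_ctrl / (1 - p_ctrl))

# Attributable fraction among the exposed, then the population
# attributable fraction for case-control data: PAF = p_case * (OR - 1) / OR.
af_exposed = (odds_ratio - 1) / odds_ratio
paf = p_case * af_exposed

print(f"OR  = {odds_ratio:.2f}")  # ~1.78
print(f"PAF = {paf:.1%}")         # ~24.7%, matching the reported 24.69%
```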

Studies are accumulating evidence that the hormone-disrupting compounds BPA (bisphenol A) and BPS (the common substitute for BPA) have numerous negative health effects in humans, including reproductive disorders. But now a second BPA substitute - BPSIP - is also being found in humans, and it may be even more persistent than BPA and BPS. All three are chemically similar, and all three are endocrine disruptors. This article points out that they have slightly different effects, and when we are exposed to more than one of them (which we are), the health effects are even more worrisome.

Unfortunately these plasticizers are in products all around us, and so are detected in almost all of us. They're in food packaging containers (and therefore in food), water bottles, can linings, toys, personal care products, thermal paper products such as cash receipts, etc. Canned foods are considered one of the most significant routes of human exposure to bisphenol A (BPA).

Other endocrine disruptors include phthalates - so read personal care product labels to avoid these. Another way to lower exposure to endocrine disruptors is to buy and store food not in plastic containers, but in glass containers or stainless steel. Don't microwave food in any sort of plastic containers. Avoid products with fragrances in them, including air fresheners. Avoid flexible vinyl (e.g. shower curtains). (For all posts on endocrine disruptors, and an article from National Institutes of Health.) From a research article by Liza Gross in PLOS Biology:

Wreaking Reproductive Havoc One Chemical at a Time

Bisphenol A (BPA), unlike DES, remained obscure until the 1950s, when chemists tapped it to make polycarbonate plastics and epoxy resins. BPA now tops the list of high-volume chemicals, and is found in numerous consumer products, including water bottles, food packaging containers and can linings, and thermal paper products like cash receipts and boarding passes (Fig 1). And because it can leach out of products, it’s been detected in the urine of nearly every person tested. It’s also been found in breast milk, follicular and amniotic fluid, cord blood, placental tissue, fetal livers, and the blood of pregnant women....

Can eating a vegetarian diet lower blood pressure? Both this review and other reviews of studies say YES: those following vegetarian diets have a lower prevalence of hypertension. Overall, the mean prevalence of hypertension was 21% in those consuming a vegetarian diet and 29% in those consuming a nonvegetarian diet (the differences varied between studies). Those following a vegetarian diet also tended to have a healthier lifestyle. As the researchers point out: blood pressure medicine lowers blood pressure for one day, while lifestyle changes (diet, exercise, not smoking, limiting or avoiding alcohol) can lower blood pressure for life. From Medscape:

Vegetarian Diet: A Prescription for High Blood Pressure?

Hypertension is one of the most costly and poorly treated medical conditions in the United States and around the world. Consequences of hypertension include morbidity and mortality related to its long-term effects, which include stroke, myocardial infarction, renal failure, limb loss, aortic aneurysm, and atrial fibrillation, among many others. Although there is an armamentarium of medications to treat hypertension, we do little for prevention. In this review we examine the relationship between vegetarian and nonvegetarian diets and the prevalence of hypertension. 

Current nonpharmacologic treatments include: physical activity (≥ 30 minutes of moderate-intensity activity on most days of the week); smoking cessation; dietary modification (lower sodium, increased potassium; mainly plant-based foods; low-fat foods; reduced-fat dairy products; moderate amounts of lean unprocessed meats, poultry, and fish; and moderate amounts of polyunsaturated and monounsaturated fats, such as olive oil); weight reduction; management of stress; and limited alcohol consumption.

It is well known that hypertension is modulated by dietary influences. In this review we examine vegetarian, vegan, and nonvegetarian (omnivore) diets and prevalence of hypertension among these dietary populations. A vegetarian diet (ie, lacto/ovo-vegetarian) includes plant foods, dairy products, and eggs (excludes all meat, such as turkey, beef, poultry, seafood, bacon, etc.). A vegan diet is similar to vegetarian, except it further excludes dairy products and eggs (no animal or animal products). On the other hand, an omnivore diet (referred to as nonvegetarians throughout this study) includes both plant and animal foods and products.....The majority of studies included in this review addressed vegetarians and vegans as a single group (vegetarians), whereas others differentiated them. Vegetarian diets are known to be low in saturated fat and cholesterol, high in fiber, low in sodium, and high in potassium. These key elements have been shown to correlate with lower incidence of cancer, heart disease, and other chronic diseases, such as diabetes type II, hypertension, and hyperlipidemia.

The exact percentage of those following a vegetarian or vegan diet in the US is unknown; however, a 2014 study found that 221 of 11,399 adult respondents, from a group generally representing the demographics of the US, identified as vegan (0.5%), vegetarians (1.5%), or meat-eaters (98%). The prevalence of hypertension in the US in 2011 was roughly 33.8%.

The mean prevalence of hypertension in those consuming a vegetarian diet was 21%, versus 29% in those consuming a nonvegetarian diet. The overall prevalence of hypertension among vegetarians was 33% lower than among nonvegetarians. These data support the hypothesis of a decreased prevalence of hypertension in those maintaining a vegan or vegetarian diet versus a nonvegetarian diet, in cross-sectional, cohort, and case-control studies, and in those consuming a vegan or vegetarian diet according to an experimental dietary change. The blood pressure benefit is noted to disappear in those reverting back to a nonvegetarian diet.

Overall, these findings support previous reviews and meta-analyses of vegetarian and nonvegetarian diets and blood pressure. A recent meta-analysis that identified 39 studies with 21,915 participants concluded vegetarian diets were associated with a drop in mean systolic (-5.9 mm Hg) and diastolic (-3.5 mm Hg) blood pressures when compared with nonvegetarians. Other reviews had similar conclusions, showing that vegetarians have a lower blood pressure compared with nonvegetarians. Of the studies that included a vegan diet separate from other vegetarians (eg, lacto/ovo), the data show a significantly lower prevalence of hypertension when compared with nonvegetarians and other vegetarians. However, limited research has been conducted on strict, consistent vegan diets.

There are possible rationalizations for the observed associations between diet and hypertension. First, vegetarians have a lower rate of smoking tobacco. Smoking can increase blood pressure acutely and chronically over time.....Second, vegetarians tend to drink less alcohol compared with nonvegetarians. Alcohol, specifically ≥ 2 drinks/day, increases blood pressure by causing vasodilation, followed by a compensatory increase in blood pressure.....Further, vegetarians have a lower mean BMI when compared with nonvegetarians, which means a lower overall weight....Fourth, vegetarians tend to exercise more than nonvegetarians. Vegetarians reported a greater incidence of physical activity of ≥ 30 minutes of moderate to vigorous activity per day.

A limitation of this study is that it remains unclear whether vegetarians are more health conscious and therefore live healthier lives, or whether a predominant diet of fruits and vegetables is a basis for lower blood pressure.
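A quick arithmetic note on the two summary figures quoted above (21% vs. 29%): here is a minimal sketch, assuming those are simple pooled prevalences, of the two natural ways to express the gap. The review's own "33% lower" estimate presumably weights the individual studies, and falls between the two naive numbers.

```python
# Naive check of how the review's summary figures relate, assuming the
# 21% (vegetarian) and 29% (nonvegetarian) values are simple pooled
# hypertension prevalences rather than study-weighted estimates.

p_veg, p_nonveg = 0.21, 0.29

prevalence_ratio = p_veg / p_nonveg
odds_ratio = (p_veg / (1 - p_veg)) / (p_nonveg / (1 - p_nonveg))

print(f"prevalence: {1 - prevalence_ratio:.1%} lower in vegetarians")  # ~27.6% lower
print(f"odds:       {1 - odds_ratio:.1%} lower in vegetarians")        # ~34.9% lower
```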

Yikes! A good reason to lose weight now rather than years from now, and not to ignore weight gain (you know, over the years as the pounds slowly creep up). The researchers found that for every 10 years of being overweight as an adult, there was an associated 7% increase in the risk for all obesity-related cancers. The degree of overweight during adulthood also mattered (a dose-response relationship) in the risk of developing cancer, especially endometrial cancer. This study looked only at postmenopausal women, so it is unknown whether the findings apply to men. From Medscape:

Longer Duration of Overweight Increases Cancer Risk in Women

A longer duration of being overweight during adulthood significantly increased the incidence of all cancers that are associated with obesity, a new study in postmenopausal women has concluded. The large population-based study was published August 16 in PLoS Medicine.

Dr Arnold and colleagues found that for every 10 years of being overweight as an adult, there was an associated 7% increase in the risk for all obesity-related cancers. The risk was highest for endometrial cancer (17%) and kidney cancer (16%). For breast cancer, the increased risk was 5%, but no significant associations were found for rectal, liver, gallbladder, pancreatic, ovarian, and thyroid cancer.

When the authors took into account the degree of excess weight over time, the risks were further increased, and there were "clear dose-response relationships," they note. Again, the risk was highest for endometrial cancer. For each additional decade spent with a body mass index (BMI) that was 10 units above normal weight, there was a 37% increase in the risk for endometrial cancer.

Study Details: The researchers used data from the huge American Women's Health Initiative (WHI) trial of postmenopausal women (aged 50 to 79 years at time of study enrollment). For this analysis, the team focused on a cohort of 73,913 postmenopausal women. During a mean follow-up of 12.6 years, 6301 obesity-related cancers were diagnosed. About 40% (n = 29,770) of women in the cohort were never overweight during their adult life....Women who were ever overweight were on average overweight for about 30 years, while those who were ever obese had been so for an average of 20 years. The authors found that the risk of being diagnosed with an obesity-related cancer rose for every 10 years of being overweight.
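To put those per-decade figures in rough perspective, here is a minimal sketch assuming the reported increases compound multiplicatively across decades; that compounding assumption is mine for illustration, since the study modeled duration of overweight directly.

```python
# Illustrative compounding of the reported per-decade risk increases
# (multiplicative scaling across decades is an assumption made here for
# intuition; the study estimated duration effects directly).

def cumulative_rr(per_decade_increase: float, decades: float) -> float:
    """Relative risk after a given number of decades at a per-decade increase."""
    return (1 + per_decade_increase) ** decades

# 7% per decade for all obesity-related cancers, over 30 years overweight:
print(f"all obesity-related cancers, 30 years overweight: RR ~ {cumulative_rr(0.07, 3):.2f}")

# 37% per decade for endometrial cancer at a BMI 10 units above normal:
print(f"endometrial cancer, 20 years at BMI +10: RR ~ {cumulative_rr(0.37, 2):.2f}")
```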