
The reduced risk of stroke came from a modest 20 grams of protein per day. From Science Daily:

Diet higher in protein may be linked to lower risk of stroke

People with diets higher in protein, especially from fish, may be less likely to have a stroke than those with diets lower in protein, according to a meta-analysis. The meta-analysis looked at all of the available research on the relationship between protein in the diet and the risk of stroke. Seven studies with a total of 254,489 participants who were followed for an average of 14 years were included in the analysis.

"The amount of protein that led to the reduced risk was moderate -- equal to 20 grams per day," said study author Xinfeng Liu, MD, PhD, of Nanjing University School of Medicine in Nanjing, China. 

Overall, the participants with the highest amount of protein in their diets were 20 percent less likely to develop a stroke than those with the lowest amount of protein in their diets. The results accounted for other factors that could affect the risk of stroke, such as smoking and high cholesterol. For every additional 20 grams per day of protein that people ate, their risk of stroke decreased by 26 percent.

Liu noted that the analysis does not support increased consumption of red meat, which has been associated with increased stroke risk. Two of the studies were conducted in Japan, where people eat less red meat than westerners do and more fish, which has been associated with decreased risk of stroke. 

The reduced risk of stroke was stronger for animal protein than vegetable protein. Protein has the effect of lowering blood pressure, which may play a role in reducing stroke risk, Liu said.

Daily sitting for hours on end is no damn good. From Medical Daily:

Too Much Sitting And Watching TV Increases Your Risk Of Certain Cancers: Why Sitting Is The New Smoking

Or the nice scientific write-up of the same study. Bottom line: to lower the risk of cancer, sit less and move more. From Science Daily:

Sedentary behavior increases risk of certain cancers

Physical inactivity has been linked with diabetes, obesity, and cardiovascular disease, but it can also increase the risk of certain cancers, according to a study published June 16 in the JNCI: Journal of the National Cancer Institute.

To assess the relationship between TV viewing time, recreational sitting time, occupational sitting time, and total sitting time with the risk of various cancers, Daniela Schmid, Ph.D., M.Sc., and Michael F. Leitzmann, M.D., Dr.P.H., of the Department of Epidemiology and Preventive Medicine, University of Regensburg, Germany, conducted a meta-analysis of 43 observational studies, including over 4 million individuals and 68,936 cancer cases.

When the highest levels of sedentary behavior were compared to the lowest, the researchers found a statistically significantly higher risk for three types of cancer -- colon, endometrial, and lung. Moreover, the risk increased with each 2-hour increase in sitting time: 8% for colon cancer, 10% for endometrial cancer, and 6% for lung cancer, although the last was borderline statistically significant. The effect also seemed to be independent of physical activity, suggesting that large amounts of time spent sitting can still be detrimental to those who are otherwise physically active. TV viewing time showed the strongest relationship with colon and endometrial cancer, possibly, the authors write, because TV watching is often associated with drinking sweetened beverages and eating junk foods.

The researchers write "That sedentariness has a detrimental impact on cancer even among physically active persons implies that limiting the time spent sedentary may play an important role in preventing cancer…."

An article comparing the U.S. versus the European Union's approach to chemicals in products (including in cosmetics, personal care products, and foods), which explains why a number of chemicals are banned in Europe, but allowed in the U.S. From Ensia:

BANNED IN EUROPE, SAFE IN THE U.S.

Atrazine, which the U.S. Environmental Protection Agency says is estimated to be the most heavily used herbicide in the U.S., was banned in Europe in 2003 due to concerns about its ubiquity as a water pollutant. 

The U.S. Food and Drug Administration places no restrictions on the use of formaldehyde or formaldehyde-releasing ingredients in cosmetics or personal care products. Yet formaldehyde-releasing agents are banned from these products in Japan and Sweden while their levels — and that of formaldehyde — are limited elsewhere in Europe. In the U.S., Minnesota has banned in-state sales of children’s personal care products that contain the chemical.

Use of lead-based interior paints was banned in France, Belgium and Austria in 1909. Much of Europe followed suit before 1940. It took the U.S. until 1978 to make this move, even though health experts had, for decades, recognized the potentially acute — even deadly — and irreversible hazards of lead exposure.

These are but a few examples of chemical products allowed to be used in the U.S. in ways other countries have decided present unacceptable risks of harm to the environment or human health. How did this happen? Are American products less safe than others? Are Americans more at risk of exposure to hazardous chemicals than, say, Europeans?

Not surprisingly, the answers are complex and the bottom line, far from clear-cut. One thing that is evident, however, is that “the policy approach in the U.S. and Europe is dramatically different."

A key element of the European Union’s chemicals management and environmental protection policies — and one that clearly distinguishes the EU’s approach from that of the U.S. federal government — is what’s called the precautionary principle. This principle, in the words of the European Commission, “aims at ensuring a higher level of environmental protection through preventative” decision-making. In other words, it says that when there is substantial, credible evidence of danger to human or environmental health, protective action should be taken despite continuing scientific uncertainty.

In contrast, the U.S. federal government’s approach to chemicals management sets a very high bar for the proof of harm that must be demonstrated before regulatory action is taken.

This is true of the U.S. Toxic Substances Control Act, the federal law that regulates chemicals used commercially in the U.S. The European law regulating chemicals in commerce, known as REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals), requires manufacturers to submit a full set of toxicity data to the European Chemical Agency before a chemical can be approved for use. U.S. federal law requires such information to be submitted for new chemicals, but leaves a huge gap in terms of what’s known about the environmental and health effects for chemicals already in use. Chemicals used in cosmetics or as food additives or pesticides are covered by other U.S. laws — but these laws, too, have high burdens for proof of harm and, like TSCA, do not incorporate a precautionary approach.

While FDA approval is required for food additives, the agency relies on studies performed by the companies seeking approval when making determinations about food additive safety. Natural Resources Defense Council senior scientist Maricel Maffini and NRDC senior attorney Tom Neltner have examined this system. “No other developed country that we know of has a similar system in which companies can decide the safety of chemicals put directly into food,” says Maffini. The two point to a number of food additives allowed in the U.S. that other countries have deemed unsafe.

Reliance on voluntary measures is a hallmark of the U.S. approach to chemical regulation. In many cases, when it comes to eliminating toxic chemicals from U.S. consumer products, manufacturers’ and retailers’ own policies — often driven by consumer demand or by regulations outside the U.S. or at the state and local level — are moving faster than U.S. federal policy. 

“Cosmetics regulations are more robust in the EU than here,” says Environmental Defense Fund health program director Sarah Vogel. U.S. regulators largely rely on industry information, she says. Industry performs copious testing, but current law does not require that cosmetic ingredients be free of certain adverse health effects before they go on the market. (FDA regulations, for example, do not specifically prohibit the use of carcinogens, mutagens or endocrine-disrupting chemicals.)

Restricting a product or chemical ingredient in cosmetics or personal care products typically involves a long, drawn-out process for the FDA. What the agency does more often is issue advisories.

At the same time, considerable deference to industry is built into the U.S. chemical regulatory system. Central to current U.S. policy are cost-benefit analyses with very high bars for proof of harm rather than proof of safety for entry onto the market. Voluntary measures have moved many unsafe chemical products off store shelves and out of use, but our requirements for proof of harm and the American historical political aversion to precaution mean we often wait far longer than other countries to act.

Another study finding risks from too low vitamin D levels. From Science Daily:

Lower vitamin D level in blood linked to higher premature death rate

Researchers at the University of California, San Diego School of Medicine have found that persons with lower blood levels of vitamin D were twice as likely to die prematurely as people with higher blood levels of vitamin D.

The finding, published in the June 12 issue of American Journal of Public Health, was based on a systematic review of 32 previous studies that included analyses of vitamin D, blood levels and human mortality rates. The specific variant of vitamin D assessed was 25-hydroxyvitamin D, the primary form found in blood.

This new finding is based on the association of low vitamin D with risk of premature death from all causes, not just bone diseases.

Garland said the blood level of vitamin D associated with about half the death rate was 30 ng/ml. He noted that two-thirds of the U.S. population has an estimated blood vitamin D level below 30 ng/ml. "This study should give the medical community and public substantial reassurance that vitamin D is safe when used in appropriate doses up to 4,000 International Units (IU) per day," said Heather Hofflich, DO, professor in the UC San Diego School of Medicine's Department of Medicine.

The average age when the blood was drawn in this study was 55 years; the average length of follow-up was nine years. The study included residents of 14 countries, including the United States, and data from 566,583 participants.

Interesting to think of bacteria and biofilms (bacterial communities resistant to treatment) being involved in stress-related heart attacks. From Science Daily:

Bacteria help explain why stress, fear trigger heart attacks

Scientists believe they have an explanation for the axiom that stress, emotional shock, or overexertion may trigger heart attacks in vulnerable people. Hormones released during these events appear to cause bacterial biofilms on arterial walls to disperse, allowing plaque deposits to rupture into the bloodstream, according to research published in mBio®, the online open-access journal of the American Society for Microbiology.

"Our hypothesis fitted with the observation that heart attack and stroke often occur following an event where elevated levels of catecholamine hormones are released into the blood and tissues, such as occurs during sudden emotional shock or stress, sudden exertion or over-exertion," said David Davies of Binghamton University, Binghamton, New York, an author on the study.

Davies and his colleagues isolated and cultured different species of bacteria from diseased carotid arteries that had been removed from patients with atherosclerosis. Their results showed multiple bacterial species living as biofilms in the walls of every atherosclerotic (plaque-covered) carotid artery tested.

In normal conditions, biofilms are adherent microbial communities that are resistant to antibiotic treatment and clearance by the immune system. However, upon receiving a molecular signal, biofilms undergo dispersion, releasing enzymes to digest the scaffolding that maintains the bacteria within the biofilm. These enzymes have the potential to digest the nearby tissues that prevent the arterial plaque deposit from rupturing into the bloodstream. According to Davies, this could provide a scientific explanation for the long-held belief that heart attacks can be triggered by a stress, a sudden shock, or overexertion.

To test this theory they added norepinephrine, at a level that would be found in the body following stress or exertion, to biofilms formed on the inner walls of silicone tubing. "At least one species of bacteria -- Pseudomonas aeruginosa -- commonly associated with carotid arteries in our studies, was able to undergo a biofilm dispersion response when exposed to norepinephrine, a hormone responsible for the fight-or-flight response in humans," said Davies. Because the biofilms are closely bound to arterial plaques, the dispersal of a biofilm could cause the sudden release of the surrounding arterial plaque, triggering a heart attack.

To their knowledge, this is the first direct observation of biofilm bacteria within a carotid arterial plaque deposit, says Davies. This research suggests that bacteria should be considered to be part of the overall pathology of atherosclerosis and management of bacteria within an arterial plaque lesion may be as important as managing cholesterol.

Note the red biofilm bacterial colonies within the diseased arterial wall:

Bacteria stained with a fluorescent bacterial DNA probe show up as red biofilm microcolonies within the green tissues of a diseased carotid arterial wall.

An interesting link. The thinking is that the number of moles (cutaneous nevi) are linked to higher levels of sex hormones. From Medical Xpress:

Moles linked to risk for breast cancer

Cutaneous nevi, commonly known as moles, may be a novel predictor of breast cancer, according to two studies published in this week's PLOS Medicine. Jiali Han and colleagues from Indiana University and Harvard University, United States, and Marina Kvaskoff and colleagues from INSERM, France, report that women with a greater number of nevi are more likely to develop breast cancer.

The researchers reached these conclusions by using data from two large prospective cohorts: the Nurses' Health Study in the United States, including 74,523 female nurses followed for 24 years, and the E3N Teachers' Study Cohort in France, including 89,902 women followed for 18 years.

In the Nurses' Health Study, Han and colleagues asked study participants to report the number of nevi >3mm on their left arm at the initial assessment. They observed that women with 15 or more nevi were 35% more likely to be diagnosed with breast cancer than women who reported no nevi, corresponding to an absolute risk of developing breast cancer of 8.48% in women with no nevi and 11.4% for women with 15 or more nevi. In a subgroup of women, they observed that postmenopausal women with six or more nevi had higher blood levels of estrogen and testosterone than women with no nevi, and that the association between nevi and breast cancer risk disappeared after adjustment for hormone levels.
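As a quick back-of-the-envelope check (my own arithmetic, not from the study), the roughly 35% relative increase reported for women with 15 or more nevi can be recovered from the two absolute risks quoted above:

```python
# Rough sanity check (not from the study itself): derive the relative
# risk increase from the two absolute breast cancer risks reported.
risk_no_nevi = 8.48    # absolute risk (%), women reporting no nevi
risk_many_nevi = 11.4  # absolute risk (%), women reporting 15+ nevi

relative_increase = (risk_many_nevi / risk_no_nevi - 1) * 100
print(f"Relative increase: {relative_increase:.0f}%")  # prints "Relative increase: 34%"
```

The result (~34%) lines up with the reported 35% once rounding in the published figures is accounted for, which is a useful reminder that a large-sounding relative risk can correspond to a modest absolute difference (here, about 3 percentage points).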

In the E3N Study, including mostly teachers, Kvaskoff and colleagues asked study participants to report whether they had no, a few, many, or very many moles. They observed that women with "very many" nevi had a 13% higher breast cancer risk than women reporting no nevi, although the association was no longer significant after adjusting for known breast cancer risk factors, especially benign breast disease or family history of breast cancer, which were themselves associated with nevi number.

These studies do not suggest that nevi cause breast cancer, but raise the possibility that nevi are affected by levels of sex hormones, which may be involved in the development of breast cancer. The findings do suggest that the number of nevi could be used as a marker of breast cancer risk, but it is unclear whether or how this information would improve risk estimation based on established risk factors. The accuracy of the findings is limited by the use of self-reported data on nevus numbers. Moreover, these findings may not apply to non-white women given that these studies involved mostly white participants.

From Medical Xpress:

Estimated risk of breast cancer increases as red meat intake increases

Higher red meat intake in early adulthood might be associated with an increased risk of breast cancer, and women who eat more legumes—such as peas, beans and lentils—poultry, nuts and fish might be at lower risk in later life, suggests a paper published in The BMJ today.

So far, studies have suggested no significant association between red meat intake and breast cancer. However, most have been based on diet during midlife and later, and many lines of evidence suggest that some exposures, potentially including dietary factors, may have greater effects on the development of breast cancer during early adulthood.

So a team of US researchers investigated the association between dietary protein sources in early adulthood and risk of breast cancer. They analysed data from 88,803 premenopausal women (aged 26 to 45) taking part in the Nurses' Health Study II who completed a questionnaire on diet in 1991. Adolescent food intake was also measured and included foods that were commonly eaten from 1960 to 1980, when these women would have been in high school. 

Medical records identified 2,830 cases of breast cancer during 20 years of follow-up.

This translated to an estimate that higher intake of red meat was associated with a 22% increased risk of breast cancer overall. Each additional serving per day of red meat was associated with a 13% increase in risk of breast cancer (12% in premenopausal and 8% in postmenopausal women).

In contrast, estimates showed a lower risk of breast cancer in postmenopausal women with higher consumption of poultry. Substituting one serving per day of poultry for one serving per day of red meat - in the statistical model - was associated with a 17% lower risk of breast cancer overall and a 24% lower risk of postmenopausal breast cancer.

Furthermore, substituting one serving per day of combined legumes, nuts, poultry, and fish for one serving per day of red meat was associated with a 14% lower risk of breast cancer overall and premenopausal breast cancer.

The authors conclude that higher red meat intake in early adulthood "may be a risk factor for breast cancer, and replacing red meat with a combination of legumes, poultry, nuts and fish may reduce the risk of breast cancer." 

Another benefit to exercising - more microbial diversity. From Medscape:

Exercise Linked to More Diverse Intestinal Microbiome

Professional athletes are big winners when it comes to their gut microflora, suggesting a beneficial effect of exercise on gastrointestinal health, investigators report in an article published online June 9 in Gut.

DNA sequencing of fecal samples from players in an international rugby union team showed considerably greater diversity of gut bacteria than samples from people who are more sedentary.

Having a gut populated with myriad species of bacteria is thought by nutritionists and gastroenterologic researchers to be a sign of good health. Conversely, the guts of obese people have consistently been found to contain fewer species of bacteria, notes Siobhan F. Clarke, PhD, from the Teagasc Food Research Centre, Moorepark, Fermoy. "Our findings show that a combination of exercise and diet impacts on gut microbial diversity. In particular, the enhanced diversity of the microbiota correlates with exercise and dietary protein consumption in the athlete group," the authors write.

The investigators used 16S ribosomal RNA amplicon sequencing to evaluate stool and blood samples from 40 male elite professional rugby players (mean age, 29 years) and 46 healthy age-matched control participants. 

Relative to control participants with a high BMI, athletes and control participants with a low BMI had improved metabolic markers. In addition, although athletes had significantly increased levels of creatine kinase, they also had overall lower levels of inflammatory markers than either of the control groups.

Athletes were also found to have more diverse gut microbiota than controls, with organisms in approximately 22 different phyla, 68 families, and 113 genera. Participants with a low BMI were colonized by organisms in just 11 phyla, 33 families, and 65 genera, and participants with a high BMI had even fewer organisms in only 9 phyla, 33 families, and 61 genera.

The professional rugby players, as the investigators expected, had significantly higher levels of total energy intake than the control participants, with protein accounting for 22% of their total intake compared with 16% for control participants with a low BMI and 15% for control participants with a high BMI. When the authors looked for correlations between health parameters and diet with various microbes or microbial diversity, they found significant positive association between microbial diversity and protein intake, creatine kinase levels, and urea.

It seems like the more microbe exposure in the first year of life, the better for the immune system. From Science Daily:

Newborns exposed to dirt, dander, germs may have lower allergy, asthma risk

Infants exposed to rodent and pet dander, roach allergens and a wide variety of household bacteria in the first year of life appear less likely to suffer from allergies, wheezing and asthma, according to results of a study conducted by scientists at the Johns Hopkins Children's Center and other institutions.

Previous research has shown that children who grow up on farms have lower allergy and asthma rates, a phenomenon attributed to their regular exposure to microorganisms present in farm soil. Other studies, however, have found increased asthma risk among inner-city dwellers exposed to high levels of roach and mouse allergens and pollutants. The new study confirms that children who live in such homes do have higher overall allergy and asthma rates but adds a surprising twist: Those who encounter such substances before their first birthdays seem to benefit rather than suffer from them. Importantly, the protective effects of both allergen and bacterial exposure were not seen if a child's first encounter with these substances occurred after age 1, the research found.

"What this tells us is that not only are many of our immune responses shaped in the first year of life, but also that certain bacteria and allergens play an important role in stimulating and training the immune system to behave a certain way."

The study was conducted among 467 inner-city newborns from Baltimore, Boston, New York and St. Louis whose health was tracked over three years.

Infants who grew up in homes with mouse and cat dander and cockroach droppings in the first year of life had lower rates of wheezing at age 3, compared with children not exposed to these allergens soon after birth. The protective effect, moreover, was additive.  In addition, infants in homes with a greater variety of bacteria were less likely to develop environmental allergies and wheezing at age 3.

When researchers studied the effects of cumulative exposure to both bacteria and mouse, cockroach and cat allergens, they noticed another striking difference. Children free of wheezing and allergies at age 3 had grown up with the highest levels of household allergens and were the most likely to live in houses with the richest array of bacterial species. Some 41 percent of allergy-free and wheeze-free children had grown up in such allergen and bacteria-rich homes. By contrast, only 8 percent of children who suffered from both allergy and wheezing had been exposed to these substances in their first year of life.

Yesterday I read and reread a very interesting journal review paper from Sept. 2013 that discussed recent studies on probiotics and the treatment of respiratory ailments, including sinusitis. Two of the authors are from the 2012 Abreu et al. sinusitis study (which I've frequently mentioned, and which guided our own Sinusitis Treatment), the one that found Lactobacillus sakei both protects against and treats sinusitis. Among the things this paper discussed: microbial communities in the airways and sinuses vary between healthy and non-healthy individuals (and each area or niche seems to have distinct communities); lactic acid bacteria (including Lactobacillus sakei) are generally considered the "good guys" in our sinus microbiomes (the communities of microbes living in our sinuses); and treatments of the future could consist of "direct localized administration of microbial species" (for example, getting the bacteria directly into the sinuses through the nasal passages with a nasal spray, or dabbing fermented kimchi juice like I did). They also mentioned that one might get probiotics to the GI tract (e.g., by eating probiotic foods), which may have some benefits. So far, though, administering something containing L. sakei directly (by nasal spray or by dabbing kimchi juice, as I did) seems to work best for treating sinusitis.

They also discussed that lactic acid bacteria are found on healthy mucosal surfaces in the respiratory, GI, and vaginal tracts. They then proposed that lactic acid bacteria (including L. sakei) act as pioneer or keystone species that shape mucosal ecosystems (the microbiomes) and permit other species with similar attributes to live there, thus promoting "mucosal homeostasis". It appears that having a healthy sinus microbiome protects against pathogenic species.

So yeah - the bottom line is that microbial supplementation with beneficial bacteria seems very promising for treating respiratory ailments. And for long-term successful sinusitis treatment, one would need to improve the entire sinus microbial community (with a "mixed species supplement"), not just add one bacterial species. (By the way, maybe that is also why using kimchi in our successful Sinusitis Treatment works - it contains an entire microbial community with several lactic acid bacteria species, including the all-important Lactobacillus sakei.) (NOTE: See the Sinusitis Treatment Summary page and The One Probiotic That Treats Sinusitis for some easy methods using various probiotics to treat chronic sinusitis. These articles get updated frequently.) From Trends in Microbiology:

Probiotic strategies for treatment of respiratory diseases.

More recently, Abreu et al. profiled the sinus microbiome of CRS (chronic rhinosinusitis) patients and healthy controls at high resolution [2]. Microbial burden was not significantly different between healthy subject and CRS patient sinuses. Moreover, known bacterial pathogens such as H. influenzae, P. aeruginosa, and S. aureus were detected in both healthy and CRS sinuses; however, the sinus microbiome of CRS patients exhibited characteristics of community collapse; in other words, many microbial species associated with healthy individuals, in particular lactic acid bacteria, were significantly reduced in relative abundance in CRS patients. In this state of microbiome depletion, the species C. tuberculostearicum was significantly enriched. This indicates that the composition of the microbiome is associated with disease status and appears to influence the activity of pathogens within these assemblages.

Although sinusitis patients in the Abreu study exhibited hallmark characteristics of community collapse, the comparator group – healthy individuals – represented an opportunity to mine microbiome data and identify those bacterial species specific to the sinus niche that putatively protect this site. The authors demonstrated that a relatively diverse group of phylogenetically distinct lactic acid bacteria were enriched in the healthy sinus microbiota [2]. As proof of principle that the sinonasal microbiome itself or indeed specific members of these consortia protect the mucosal surface from pathogenic effects, a series of murine studies were undertaken. These demonstrated that a replete, unperturbed sinus microbiome prevented C. tuberculostearicum pathogenesis. Moreover, even in the context of an antimicrobial-depleted microbiome, Lactobacillus sakei when co-instilled with C. tuberculostearicum into the nares of mice afforded complete mucosal protection against the pathogenic species. Although this is encouraging, it is unlikely that a single species can confer long-term protection in a system that is inherently multi-species and constantly exposed to the environment. Indeed, previous studies and ecological theory support the hypothesis that multi-species consortia represent more robust assemblages, and tend to afford improved efficacy with respect to disease or infection outcomes [44,45]. This study therefore provides a basis for the identification of what may be termed a minimal microbial population (MMP) composed of multiple phylogenetically distinct lactic acid bacteria, including L. sakei. Such a mixed species assemblage would form the foundation of a rationally designed, sinus-specific bacterial supplement to combat established chronic diseases or, indeed, be used prophylactically to protect mucosal surfaces against acute infection.

Therefore, although site-specific diseases such as chronic sinusitis may well be confined to the sinus niche and be resolved simply by localized microbe-restoration approaches, it is also entirely plausible that an adjuvant oral microbe-supplementation strategy and dietary intervention (to sustain colonization by the introduced species) may increase efficacy and ultimately improve long-term patient outcomes. This two-pronged approach may be particularly efficacious for patients who have lost protective GI microbial species due to administration of multiple courses of oral antimicrobials to manage their sinus disease.

Although it is impossible to define the precise strains or species that will be used in future microbial supplementation strategies to treat chronic inflammatory diseases, there is a convergence of evidence indicating that healthy mucosal surfaces in the respiratory, GI, and vaginal tract are colonized by lactic acid bacteria. We would venture that members of this group act as pioneer, keystone species that, through their multitude of functions (including bacteriocin production, competitive colonization, lactate and fatty acid production), can shape mucosal ecosystems, thereby permitting co-colonization by phylogenetically distinct species that share functionally similar attributes. Together, these subcommunities promote mucosal homeostasis and represent the most promising species for future microbe-supplementation strategies.