
A new study that analyzed other studies (a meta-analysis) found that the class of flame retardant chemicals called PBDEs (commonly found in furniture and household products) harms children's intelligence, resulting in a loss of IQ points. Most of the studies measured the child's exposure to flame retardants during pregnancy and then IQ later in childhood. They found that the child's IQ was reduced by 3.70 points for each ten-fold increase in flame retardant levels (thus, the higher the PBDE levels, the greater the effect on the child's IQ). This is of concern because flame retardants are in so many products around us, both in and out of the home. Older flame retardants (PBDEs) were phased out by 2013, but it turns out that the newer replacements (TBB and TBPH, components of mixtures such as Firemaster 550) also get into people and also have negative health effects.
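As a rough worked illustration of that dose-response figure (my own arithmetic, assuming the 3.70-point estimate applies per log10 increase, which is what "per ten-fold increase" implies):

\[
\Delta \mathrm{IQ} \approx -3.70 \times \log_{10}\!\left(\frac{\text{PBDE level}}{\text{reference PBDE level}}\right)
\]

so a ten-fold increase in a mother's PBDE levels corresponds to roughly 3.7 fewer IQ points, and a hundred-fold increase to roughly 7.4 fewer points.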

More and more research is finding health problems with flame retardants because they are "not chemically bound" to the products in which they are used - thus they escape over time and get into us via the skin (dermal contact), inhalation (from dust), and ingestion (from certain foods and from dust on our fingers). And because flame retardants are persistent, they bioaccumulate (build up in the body over time). They can be measured in our urine and blood. Evidence suggests that flame retardants may act as endocrine and thyroid disruptors, be carcinogenic, alter hormone levels, decrease semen quality in men, and act as developmental neurotoxicants (when the developing fetus is exposed during pregnancy), so that children have lowered IQs and more hyperactive behaviors.

Where are flame retardants found? All around us, and in us. They are so hard to avoid because they're in electronic goods, upholstered furniture, polyurethane foam, carpet pads, some textiles, the foam in baby items (car seats, bumpers, crib mattresses, strollers, nursing pillows, etc.), house dust, building insulation, and on and on. What to do? Wash hands before eating. Try to use a vacuum cleaner with a HEPA filter. Try to avoid products that say they contain "flame retardants". Only buy upholstered furniture with tags stating that it is free of flame retardants. The California Childcare Health Program has an information sheet on how to lower exposure to fire retardants. From Medical Xpress:

Flame retardant exposure found to lower IQ in children

A hazardous class of flame retardant chemicals commonly found in furniture and household products damages children's intelligence, resulting in loss of IQ points, according to a new study by UC San Francisco researchers. The study, published Aug. 3, 2017, in Environmental Health Perspectives, included the largest meta-analysis performed on flame retardants to date, and presented strong evidence of polybrominated diphenyl ethers' (PBDE) effect on children's intelligence. "Despite a series of bans and phase-outs, nearly everyone is still exposed to PBDE flame retardants, and children are at the most risk," said UCSF's Tracey Woodruff, professor in the Department of Obstetrics, Gynecology and Reproductive Sciences.....

The findings go beyond merely showing a strong correlation: using rigorous epidemiological criteria, the authors considered factors like strength and consistency of the evidence to establish that there was "sufficient evidence" supporting the link between PBDE exposure and intelligence outcomes. Furthermore, a recent report by the National Academies of Sciences endorsed the study and integrated evidence from animal studies to reach similar conclusions that PBDEs are a "presumed hazard" to intelligence in humans.

Researchers examined data from studies around the world, covering nearly 3,000 mother-child pairs. They discovered that every 10-fold increase in a mom's PBDE levels led to a drop of 3.7 IQ points in her child. "Many people are exposed to high levels of PBDEs, and the more PBDEs a pregnant woman is exposed to, the lower her child's IQ," said Woodruff. "And when the effects of PBDEs are combined with those of other toxic chemicals such as from building products or pesticides, the result is a serious chemical cocktail that our current environmental regulations simply don't account for." The researchers also found some evidence of a link between PBDE exposures and attention deficit hyperactivity disorder (ADHD), but concluded that more studies are necessary to better characterize the relationship.

PBDEs first came into widespread use after California passed fire safety standards for furniture and certain other products in 1975. Thanks to the size of the Californian market, flame retardants soon became a standard treatment for furniture sold across the country..... Mounting evidence of PBDEs' danger prompted reconsideration, and starting in 2003, California, other states, and international bodies approved bans or phase-outs for some of the most common PBDEs. PBDEs and similar flame retardants are especially concerning because they aren't chemically bonded to the foams they protect. Instead, they are merely mixed in, so they can easily leach out from the foam and into house dust, food, and eventually, our bodies. [Original study.]

Finally some good news regarding ticks and the diseases they can transmit to humans. Currently, ticks in the US are known to transmit at least 14 diseases, including Lyme disease. But a recent study done in the Netherlands found that the presence of predators such as foxes resulted in mice and voles having fewer ticks on them, with a really big reduction in both tick numbers and in the percentage of ticks infected with a disease. The researchers thought that this was due to the mice and voles being less active when predators were nearby, and also to mice and voles that did venture further afield being preyed upon by the predators. So be happy if you see foxes in your neighborhood - they're beneficial. Excerpts from the NY Times:

Lyme Disease’s Worst Enemy? It Might Be Foxes

It is August, the month when a new generation of black-legged ticks that transmit Lyme and other diseases are hatching. On forest floors, suburban estates and urban parks, they are looking for their first blood meal. And very often, in the large swaths of North America and Europe where tick-borne disease is on the rise, they are feeding on the ubiquitous white-footed mice and other small mammals notorious for harboring pathogens that sicken humans.

But it doesn’t have to be that way. A new study suggests that the rise in tick-borne disease may be tied to a dearth of traditional mouse predators, whose presence might otherwise send mice scurrying into their burrows. If mice were scarcer, larval ticks, which are always born uninfected, might feed on other mammals and bird species that do not carry germs harmful to humans. Or they could simply fail to find that first meal. Ticks need three meals to reproduce; humans are at risk of contracting diseases only from ticks that have previously fed on infected hosts.

For the study, Tim R. Hofmeester, then a graduate student at Wageningen University in the Netherlands and the lead researcher of the study, placed cameras in 20 plots across the Dutch countryside to measure the activity of foxes and stone martens, key predators of mice. Some were in protected areas, others were in places where foxes are heavily hunted. Over two years, he also trapped hundreds of mice — and voles, another small mammal — in the same plots, counted how many ticks were on them, and tested the ticks for infection with Lyme and two other disease-causing bacteria. To capture additional ticks, he dragged a blanket across the ground.

In the plots where predator activity was higher, he found only 5 to 10 percent as many newly hatched ticks on the mice as in areas where predators were scarcer. Thus, there would be fewer ticks to pass along pathogens to the next generation of mice. In the study, the density of infected “nymphs,” as the adolescent ticks are called, was reduced to 6 percent of previous levels in areas where foxes were more active. “The predators appear to break the cycle of infection,’’ said Dr. Hofmeester, who earned his Ph.D. after the study.

Interestingly, the predator activity in Dr. Hofmeester’s plots did not decrease the density of the mouse population itself, as some ecologists had theorized it might. Instead, the lower rates of infected ticks, Dr. Hofmeester suggested in the paper, published in Proceedings of the Royal Society B, may be the result of small mammals curtailing their own movement when predators are around. [Original study.]

Two more studies found that higher levels of vitamin D in the blood are associated with better health outcomes - one study found a lower risk of breast cancer, especially among postmenopausal women, and in the other - better outcomes after a metastatic melanoma diagnosis.

The breast cancer study suggested that a fairly high blood level of vitamin D (25(OH)D serum level >38.0 ng/mL) was associated with a lower risk of breast cancer. But overall they found that women taking vitamin D supplements more than 4 times a week, at any dose, had a lower risk of breast cancer over a 5-year period than women not taking vitamin D supplements. From Environmental Health Perspectives:

Serum Vitamin D and Risk of Breast Cancer within Five Years

Vitamin D is an environmental and dietary agent with known anticarcinogenic effects, but protection against breast cancer has not been established. We evaluated the association between baseline serum 25-hydroxyvitamin D [25(OH)D] levels, supplemental vitamin D use, and breast cancer incidence over the subsequent 5 y of follow-up. From 2003-2009, the Sister Study enrolled 50,884 U.S. women 35-74 y old who had a sister with breast cancer but had never had breast cancer themselves. Using liquid chromatography-mass spectrometry, we measured 25(OH)D in serum samples from 1,611 women who later developed breast cancer and from 1,843 randomly selected cohort participants.

We found that 25(OH)D levels were associated with a 21% lower breast cancer hazard (highest versus lowest quartile). Analysis of the first 5 y of follow-up for all 50,884 Sister Study participants showed that self-reported vitamin D supplementation was associated with an 11% lower hazard. These associations were particularly strong among postmenopausal women.

In this cohort of women with elevated risk, high serum 25(OH)D levels and regular vitamin D supplement use were associated with lower rates of incident, postmenopausal breast cancer over 5 y of follow-up. These results may help to establish clinical benchmarks for 25(OH)D levels; in addition, they support the hypothesis that vitamin D supplementation is useful in breast cancer prevention.

The first sentence in the melanoma study lays out what is widely known: "Vitamin D deficiency (≤20 ng/mL) is associated with an increased incidence and worse prognosis of various types of cancer including melanoma." Studies show that the relationship between vitamin D, sunlight exposure, and melanoma is complicated in a number of ways; for example, sun exposure may be associated with increased survival in patients with melanoma, which may mean that vitamin D has a protective role in patients with melanoma. Several studies suggest that vitamin D may delay melanoma recurrence and improve overall prognosis. The study also found that metastatic melanoma patients with vitamin D deficiency who do not or cannot raise their vitamin D blood levels (25(OH)D3) have a worse outcome compared to those who are able to markedly increase their 25(OH)D3 levels (by more than 20 ng/mL). From Oncotarget:

Vitamin D deficiency is associated with a worse prognosis in metastatic melanoma

Vitamin D deficiency (≤20 ng/mL) is associated with an increased incidence and worse prognosis of various types of cancer including melanoma. A retrospective, single-center study of individuals diagnosed with melanoma from January 2007 through June 2013 who had a vitamin D (25(OH)D3) level measured within one year of diagnosis was performed to determine whether vitamin D deficiency and repletion are associated with melanoma outcome.

A total of 409 individuals diagnosed with histopathology-confirmed melanoma who had an ever measured serum 25(OH)D3 level were identified. 252 individuals with a 25(OH)D3 level recorded within one year after diagnosis were included in the study .... A worse melanoma prognosis was associated with vitamin D deficiency, higher stage, ulceration, and higher mitotic rate. In patients with stage IV metastatic melanoma, vitamin D deficiency was associated with significantly worse melanoma-specific mortality. Patients with metastatic melanoma who were initially vitamin D deficient and subsequently had a decrease or ≤20 ng/mL increase in their 25(OH)D3 concentration had significantly worse outcomes compared to non-deficient patients who had a >20 ng/mL increase. Our results suggest that initial vitamin D deficiency and insufficient repletion is associated with a worse prognosis in patients with metastatic melanoma.

I've frequently mentioned that when taking vitamin D supplements, the one to take is vitamin D3, and not D2. Medscape (the medical site) has an article explaining that results of a recent study showed that vitamin D3 is twice as effective as D2 in raising blood levels of vitamin D. The vitamin D3 form is derived from animal products, while vitamin D2 is plant-based. So check any supplements you purchase because many contain the vitamin D2 form of vitamin D.

Of course, sunlight is the best source because it has benefits beyond vitamin D - such as low levels of "blue light", which energizes T cells (part of the immune system). From Medscape:

Vitamin D3, Not D2, Is Key to Tackling Vitamin D Deficiency

Vitamin D3 is significantly more effective at raising the serum biological marker of vitamin D status than vitamin D2 when given at standard doses in everyday food and drink, say UK researchers — findings that could have major implications for both current guidelines and the supplement industry.

In a randomized controlled trial of vitamin D supplements, vitamin D3, which is derived from animal products, was associated with significantly higher serum total 25-hydroxyvitamin D [25(OH)D] levels after 12 weeks than vitamin D2, which is plant-based and currently used in the vast majority of vitamin D supplements.

"The importance of vitamin D in our bodies is not to be underestimated, but living in the UK it is very difficult to get sufficient levels from its natural source, the sun, so we know it has to be supplemented through our diet," explained lead author Laura Tripkovic, PhD, department of nutritional sciences, University of Surrey, Guildford, United Kingdom, in a press release.

She added, "Our findings show that vitamin D3 is twice as effective as D2 in raising vitamin D levels in the body, which turns current thinking about the two types of vitamin D on its head." "Those who consume D3 through fish, eggs, or vitamin D3-containing supplements are twice as likely to raise their vitamin D status [compared with those] consuming vitamin D2-rich foods, such as mushrooms, vitamin D2-fortified bread, or vitamin D2-containing supplements, helping to improve their long-term health." [Original study.]

The last post discussed the steep ongoing decline in sperm counts and sperm concentration in men from North America, Europe, Australia, and New Zealand. It mentioned a number of environmental causes that could be contributing to this, including the huge increase of chemicals, especially endocrine disruptors (chemicals that disrupt our hormones), over the past few decades.

But another study was also just published that showed (in mice) that the effects of chronic exposure to endocrine disrupting chemicals are amplified over 3 generations - each generation has even lower sperm counts and sperm concentrations, and more reproductive abnormalities. So each generation gets progressively worse with continued exposure.

As the researchers state: "Our findings suggest that neonatal estrogenic exposure can affect both the reproductive tract and sperm production in exposed males, and exposure effects are exacerbated by exposure spanning multiple generations. Because estrogenic chemicals have become both increasingly common and ubiquitous environmental contaminants in developed countries, the implications for humans are serious. Indeed, it is possible effects are already apparent, with population-based studies from the U.S., Europe, Japan, and China reporting reductions in sperm counts/quality and male fertility within a span of several decades." Yikes...

The World Health Organization (WHO) considers 40 million sperm per milliliter or below to indicate an impaired ability to fertilize an egg, and below 15 million sperm per milliliter WHO considers fertilization unlikely. This is why the sperm count study discussed in the last post is so frightening: North American, European, Australian, and New Zealand men whose partners are not pregnant and who have not fathered children (i.e., men not confirmed to be fertile) have experienced a drop in average sperm count of about 50 percent over four decades, to 47 million sperm per milliliter. Niels Skakkebæk, a Danish pediatrician and researcher working on this topic, said: "Here in Denmark, there is an epidemic of infertility." and "Most worryingly [in Denmark] is that semen quality is in general so poor that an average young Danish man has much fewer sperm than men had a couple of generations ago, and more than 90 percent of their sperm are abnormal." Uh-oh... What will it take for governments to address this serious issue?

In the meantime, see the last post for some tips on how to reduce your own exposure to endocrine disrupting chemicals. Just note that you can reduce exposure, but you can't totally eliminate exposure. Excerpts from Environmental Health News:

Science: Are we in a male fertility death spiral?

Margaret Atwood's 1985 book, The Handmaid's Tale, played out in a world with declining human births because pollution and sexually transmitted disease were causing sterility. Does fiction anticipate reality? Two new research papers add scientific weight to the possibility that pollution, especially endocrine disrupting chemicals (EDCs), is undermining male fertility.

The first, published Tuesday, is the strongest confirmation yet obtained that human sperm concentration and count are in a long-term decline: more than 50 percent from 1973 to 2013, with no sign that the decline is slowing. "The study is a wakeup that we are in a death spiral of infertility in men," said Frederick vom Saal, Curators’ Distinguished Professor Emeritus of Biological Sciences at the University of Missouri and an expert on endocrine disruption who was not part of either study.

The second study, published last week by different authors, offers a possible explanation. It found that early life exposure of male mouse pups to a model environmental estrogen, ethinyl estradiol, causes mistakes in development of the reproductive tract that lead to lower sperm counts. According to vom Saal, the second study "provides a mechanistic explanation for a progressive decrease in sperm count over generations." What makes this study unique is that it examined what happened when three successive generations of males were exposed—instead of looking only at the first. Hunt, in an email, said "we asked a simple question with real-world relevance that had simply never been addressed."

In the real world, since World War II, successive generations of people have been exposed to a growing number and quantity of environmental estrogens—chemicals that behave like the human hormone estrogen. Thousands of papers published in the scientific literature (reviewed here) tie these to a wide array of adverse consequences, including infertility and sperm count decline. This phenomenon—exposure of multiple generations of mammals to endocrine disrupting compounds—had never been studied experimentally, even though that's how humans have experienced EDC exposures for at least the last 70 years. That's almost three generations of human males. Men moving into the age of fatherhood are ground zero for this serial exposure.

So Horan, Hunt and their colleagues at WSU set out to mimic, for the first time, this real-world reality. They discovered that the effects are amplified in successive generations. They observed adverse effects starting in the first generation of mouse lineages where each generation was exposed for a brief period shortly after birth. The impacts worsened in the second generation compared to the first, and by the third generation the scientists were finding animals that could not produce sperm at all. This latter condition was not seen in the first two generations exposed. Details of the experimental results actually suggested that multiple generations of exposure may have increased male sensitivity to the chemical. [Original study.]

Once again a study (this time a review and meta-analysis of other studies) found an alarming and steep decline in sperm counts in men from Western countries over a 40-year period. This steep decline in both sperm concentration (SC) and total sperm count (TSC) is for men in North America, Europe, Australia, and New Zealand. Sperm count and sperm concentration declined 50 to 60% between 1973 and 2011, a downward slope of about 1.4% to 1.6% per year. On the other hand, men from South America, Asia, and Africa did not show a decline.

The authors of the study were very concerned over the results showing this decline in Western countries, with no evidence of the decline leveling off. As these declines continue, more and more men will have sperm counts below the point at which they can reproduce. Instead they will be infertile or "sub-fertile" (with a decreased probability of conceiving a child). The threshold level associated with a "decreased monthly probability of conception" is considered to be 40 million/ml.

Shockingly, this study found that in 1973, among Western men who were not selected for fertility and didn't know their fertility status (e.g., college students, men screened for the military), the average sperm concentration was 99 million/ml, but by 2011 it was 47.1 million/ml. These men were called "unselected" and are likely to be representative of men in the general population. Men known to be fertile (e.g., who had fathered a child) averaged 83.8 million/ml in 1976, but were down to 62.0 million/ml in 2011. Both groups had consistent declines year after year.
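As a quick back-of-the-envelope check of those numbers (my own arithmetic, assuming a roughly linear decline expressed as a percentage of the 1973 value):

\[
\frac{99 - 47.1}{99} \approx 52\% \ \text{total decline over}\ 2011 - 1973 = 38\ \text{years}, \qquad \frac{52\%}{38} \approx 1.4\%\ \text{per year},
\]

which is consistent with the reported downward slope of about 1.4% per year.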

What about the men from South America, Asia, and Africa? There, studies showed that the "unselected" men (not selected for fertility and who didn't know their fertility status) started out at 72.7 million/ml in 1983, and were at 62.6 million/ml in 2011, while men known to be fertile started out on average at 66.4 million/ml in 1978 and were at 75.7 million/ml in 2011. They did not show the significant decline seen in the North American, European, Australian, and New Zealand men.

What does this mean? And what is going on? These results go beyond fertility and reproduction. The decline is consistent with other male reproductive health indicators over the last few decades: higher incidence of testicular cancer, higher rates of cryptorchidism, earlier onset of male puberty, and a decline in average testosterone levels. Thus, it appears that men's sperm counts are "the canary in the mine" for male health - evidence of harm to men from environmental and lifestyle influences.

These Western developed countries are awash in chemicals and plastics, and endocrine disruptors (hormone disruptors) are in our foods, personal care products, etc. - so studies find these chemicals in all of us, in varying degrees. The same goes for flame retardants, pesticides, and "scented" products, as well as for exposure to all sorts of environmental pollutants - whether in air, water, soil, or food - such as high levels of aluminum. All of these can have an effect on sperm counts and reproductive health.

And note that chemicals that can depress sperm counts  are also linked to many health problems, including chronic diseases.

What can you do?  You can lower your exposure to many chemicals (e.g., pesticides), plastics, and endocrine disruptors, but you can't avoid them totally. Yes, it'll mean reading labels and ingredient lists on foods, personal care products (such as soaps, shampoo, lotion), and products used in the home. [LIST OF THINGS YOU CAN EASILY DO]

TRY TO AVOID OR LOWER EXPOSURE TO: phthalates, parabens, BPA, BPS, and even BPA-free labeled products (all use similar chemicals), flame retardants (e.g., in upholstered furniture and rugs), stain-resistant, dirt-resistant, and waterproof coatings (such as Scotchgard), non-stick cookware coatings, dryer sheets, scented products (including scented candles and air fresheners), fragrances, pesticides in the yard and home, and "odor-free", antibacterial, antimicrobial, and anti-mildew products. Don't microwave foods in plastic containers (including microwave popcorn bags).

INSTEAD: Try to eat more organic foods, look for organic or least-toxic Integrated Pest Management (IPM) alternatives for the home and garden. Store foods as much as possible in glass, ceramic, or stainless steel containers. Buy foods, if possible, that are in glass bottles - not cans (all lined with endocrine disrupting chemicals) and not plastic bottles or containers (plastics leach). Some people use water filters because there are so many contaminants in our water, even if they meet federal guidelines on "allowable levels" in the water.

Avoid cigarette smoke or smoking. Try to lose weight if overweight. Open windows now and then in your residence to lower indoor air pollution. The list is long - yes, a lifestyle change! (see posts on ENDOCRINE DISRUPTORS, FLAME RETARDANTS, and PESTICIDES)

From Medical Xpress: Study shows a significant ongoing decline in sperm counts of Western men

In the first systematic review and meta-analysis of trends in sperm count, researchers from the Hebrew University-Hadassah Braun School of Public Health and Community Medicine and the Icahn School of Medicine at Mount Sinai report a significant decline in sperm concentration and total sperm count among men from Western countries.

By screening 7,500 studies and conducting a meta-regression analysis on 185 studies between 1973 and 2011, the researchers found a 52.4 percent decline in sperm concentration, and a 59.3 percent decline in total sperm count, among men from North America, Europe, Australia and New Zealand who were not selected based on their fertility status. In contrast, no significant decline was seen in South America, Asia and Africa, where far fewer studies have been conducted. The study also indicates the rate of decline among Western men is not decreasing: the slope was steep and significant even when analysis was restricted to studies with sample collection between 1996 and 2011.

The findings have important public health implications. First, these data demonstrate that the proportion of men with sperm counts below the threshold for subfertility or infertility is increasing. Moreover, given the findings from recent studies that reduced sperm count is related to increased morbidity and mortality, the ongoing decline points to serious risks to male fertility and health.

"Decreasing sperm count has been of great concern since it was first reported twenty-five years ago. This definitive study shows, for the first time, that this decline is strong and continuing. The fact that the decline is seen in Western countries strongly suggests that chemicals in commerce are playing a causal role in this trend," Dr. Shanna H Swan, a professor in the Department of Environmental Medicine and Public Health at the Icahn School of Medicine at Mount Sinai, New York.

While the current study did not examine causes of the observed declines, sperm count has previously been plausibly associated with environmental and lifestyle influences, including prenatal chemical exposure, adult pesticide exposure, smoking, stress and obesity. Therefore, sperm count may sensitively reflect the impact of the modern environment on male health across the lifespan and serve as a "canary in the coal mine" signaling broader risks to male health. [Original study.]

  Human sperm. Credit: Wikipedia

Should tackle football continue to be played in its current form? A study with horrifying results that was published this week in the Journal of the American Medical Association raises that question once again.

The study examined 202 brains of people who had formerly played football for varying lengths of time and at varying levels (some played only before high school; others played in high school, at the college level, semi-professionally, or in the Canadian Football League). They found the highest percentage of the degenerative brain disease chronic traumatic encephalopathy (CTE) among former NFL players (110 out of 111 brains). However, the overall incidence of CTE was 87% when looking at all 202 brains.

They also found that 3 of the 14 former high school players had mild CTE, but that the majority of former college, semiprofessional, and professional players had severe CTE.

The one thing to keep in mind is that the study only examined donated brains of former football players - which means the family members were concerned about CTE in the former player (perhaps there were symptoms suggestive of CTE). So we don't know the actual percentage of CTE among current and former football players. But studies (here, here, and here) do show damage from hits received during football games and practice even at the grammar school and high school level - and the damage can be from subconcussive hits.

But note that concussions and subconcussive hits (head trauma) also occur in other sports, such as soccer. Everyone agrees we need more studies, and we also need to rethink how some games are played in childhood to protect developing brains.

From NPR: Study: CTE Found In Nearly All Donated NFL Player Brains

As the country starts to get back into its most popular professional team sport, there is a reminder of how dangerous football can be. An updated study published Tuesday by the Journal of the American Medical Association on football players and the degenerative brain disease chronic traumatic encephalopathy reveals a striking result among NFL players.

Did you know that you exchange some skin microbes with the person you live with? A recent study looked at the microbial communities on different regions of the skin of 10 heterosexual couples living together. The researchers found that cohabitation resulted in microbes being shared, but that a person's own microbial profile, their biological sex, and which region of the skin was sampled mattered more. In other words, people's microbes look more like their own microbiome than like that of their significant other.

Skin is the largest organ of the body, and it is a protective barrier between a person and their environment. The skin contains a diverse microbial community of largely beneficial and benign microorganisms, and it also protects the body from microorganisms with the potential to cause disease. Studies show that at least one million microbes (bacteria, fungi, viruses, archaea, etc.) occupy each square centimeter of skin. Humans shed over one million biological particles per hour.

The researchers also found that female skin microbial communities were more diverse than those of males, and that spending more time outdoors, owning pets, and drinking less alcohol (or none) were all associated with higher skin microbial diversity. They found that a person's biological sex could be determined 100% of the time from the microbes on inner thigh skin. The skin of the feet had the most matched microbes among couples - perhaps because when partners walk barefoot on the same floors and shower surfaces, they share microbes (from shed skin particles). From Science Daily:

Not under the skin, but on it: Living together brings couples' microbiomes together

Couples who live together share many things: Bedrooms, bathrooms, food, and even bacteria. After analyzing skin microbiomes from cohabitating couples, microbial ecologists at the University of Waterloo, in Canada, found that people who live together significantly influence the microbial communities on each other's skin. The commonalities were strong enough that computer algorithms could identify cohabitating couples with 86 percent accuracy based on skin microbiomes alone, the researchers report this week in mSystems, an open-access journal of the American Society for Microbiology.

However, the researchers also reported that cohabitation is likely less influential on a person's microbial profile than other factors like biological sex and what part of the body is being studied. In addition, the microbial profile from a person's body usually looks more like their own microbiome than like that of their significant other. "You look like yourself more than you look like your partner," says Ashley Ross, who led the study while a graduate student in the lab of Josh Neufeld.

Neufeld and Ross, together with Andrew Doxey, analyzed 330 skin swabs collected from 17 sites on the participants, all of whom were heterosexual and lived in the Waterloo region. Participants self-collected samples with swabs, and sites included the upper eyelids, outer nostrils, inner nostrils, armpits, torso, back, navel, and palms of hands. Neufeld says the study is the first to identify regions of skin with the most similar microbiomes between partners. They found the strongest similarities on partners' feet. "In hindsight, it makes sense," says Neufeld. "You shower and walk on the same floor barefoot. This process likely serves as a form of microbial exchange with your partner, and also with your home itself." 

The analyses revealed stronger correlations in some sites than in others. For example, microbial communities on the inner thigh were more similar among people of the same biological sex than between cohabiting partners. Computer algorithms could differentiate between men and women with 100 percent accuracy by analyzing inner thigh samples alone, suggesting that a person's biological sex can be determined based on that region, but not others. The researchers also found that the microbial profiles of sites on a person's left side -- like hands, eyelids, armpits, or nostrils -- strongly resemble those on their right side. Of all the swab sites, the least microbial diversity was found on either side of the outer nose. [Original study.]

All of us at some point or another have wondered whether we can hold on to medicines past their expiration date, or whether we need to throw them out. And if they're still good past the expiration date, how far past? Well... the investigative journalism site ProPublica has been examining this issue, and they published an article reporting that researchers and the government have found that many medicines may be good for YEARS past the expiration date. Yes - years!

According to people interviewed for the article, there are no documented cases of negative effects from taking expired drugs. Most medicines are fairly stable and last past their expiration dates, though the article mentions some exceptions: liquid medicines, the asthma inhalant albuterol, the topical rash spray diphenhydramine, and a local anesthetic made from lidocaine and epinephrine. Not mentioned in the article, but another drug that degrades and can cause health problems (such as Fanconi syndrome) is tetracycline. Other than those, it looks like many medicines are incredibly stable past the expiration date - especially when stored properly in their original sealed containers. Do go read the whole article - it's eye-opening, including the part about the government's "Shelf Life Extension Program". Excerpts from ProPublica:

The Myth of Drug Expiration Dates

The box of prescription drugs had been forgotten in a back closet of a retail pharmacy for so long that some of the pills predated the 1969 moon landing. Most were 30 to 40 years past their expiration dates — possibly toxic, probably worthless. But to Lee Cantrell, who helps run the California Poison Control System, the cache was an opportunity to answer an enduring question about the actual shelf life of drugs: Could these drugs from the bell-bottom era still be potent?

Gerona and Cantrell, a pharmacist and toxicologist, knew that the term “expiration date” was a misnomer. The dates on drug labels are simply the point up to which the Food and Drug Administration and pharmaceutical companies guarantee their effectiveness, typically at two or three years. But the dates don’t necessarily mean they’re ineffective immediately after they “expire” — just that there’s no incentive for drugmakers to study whether they could still be usable.

What if the system is destroying drugs that are technically “expired” but could still be safely used? In his lab, Gerona ran tests on the decades-old drugs, including some now defunct brands such as the diet pills Obocell (once pitched to doctors with a portly figurine called “Mr. Obocell”) and Bamadex. Overall, the bottles contained 14 different compounds, including antihistamines, pain relievers and stimulants. All the drugs tested were in their original sealed containers. The findings surprised both researchers: A dozen of the 14 compounds were still as potent as they were when they were manufactured, some at almost 100 percent of their labeled concentrations. “Lo and behold,” Cantrell says, “The active ingredients are pretty darn stable.”

....That raises an even bigger question: If some drugs remain effective well beyond the date on their labels, why hasn’t there been a push to extend their expiration dates? It turns out that the FDA, the agency that helps set the dates, has long known the shelf life of some drugs can be extended, sometimes by years. For decades, the federal government has stockpiled massive stashes of medication, antidotes and vaccines in secure locations throughout the country. The drugs are worth tens of billions of dollars and would provide a first line of defense in case of a large-scale emergency.

Maintaining these stockpiles is expensive. The drugs have to be kept secure and at the proper humidity and temperature so they don’t degrade. Luckily, the country has rarely needed to tap into many of the drugs, but this means they often reach their expiration dates. Though the government requires pharmacies to throw away expired drugs, it doesn’t always follow these instructions itself. Instead, for more than 30 years, it has pulled some medicines and tested their quality.

Once a drug is launched, the makers run tests to ensure it continues to be effective up to its labeled expiration date. Since they are not required to check beyond it, most don’t, largely because regulations make it expensive and time-consuming for manufacturers to extend expiration dates, says Yan Wu, an analytical chemist who is part of a focus group at the American Association of Pharmaceutical Scientists that looks at the long-term stability of drugs. Most companies, she says, would rather sell new drugs and develop additional products. Pharmacists and researchers say there is no economic “win” for drug companies to investigate further. They ring up more sales when medications are tossed as “expired” by hospitals, retail pharmacies and consumers despite retaining their safety and effectiveness.

That being said, it’s an open secret among medical professionals that many drugs maintain their ability to combat ailments well after their labels say they don’t.....In 1986, the Air Force, hoping to save on replacement costs, asked the FDA if certain drugs’ expiration dates could be extended. In response, the FDA and Defense Department created the Shelf Life Extension Program. Each year, drugs from the stockpiles are selected based on their value and pending expiration and analyzed in batches to determine whether their end dates could be safely extended. For several decades, the program has found that the actual shelf life of many drugs is well beyond the original expiration dates.

A 2006 study of 122 drugs tested by the program showed that two-thirds of the expired medications were stable every time a lot was tested. Each of them had their expiration dates extended, on average, by more than four years, according to research published in the Journal of Pharmaceutical Sciences. Some that failed to hold their potency include the common asthma inhalant albuterol, the topical rash spray diphenhydramine, and a local anesthetic made from lidocaine and epinephrine, the study said. But neither Cantrell nor Dr. Cathleen Clancy, associate medical director of National Capital Poison Center, a nonprofit organization affiliated with the George Washington University Medical Center, had heard of anyone being harmed by any expired drugs. Cantrell says there has been no recorded instance of such harm in medical literature.

The use of nanoparticles in foods is increasing every year, but we still know very little about whether they pose health risks to humans, especially for people eating foods containing them daily (and thus having chronic exposure). The nanoparticles in foods are ingredients so small that they are measured in nanometers, or billionths of one meter. The most common nanoingredients are titanium dioxide, silicon dioxide, and zinc oxide.

Titanium dioxide is typically used as a "food coloring" to make foods whiter or brighter, but it may or may not be listed on the label. In Europe, this food additive is known as E171. Currently there are no restrictions on using titanium dioxide nanoparticles in food.

Recent research suggests that there may be health effects from the nanoparticles in our food (here and here), so we should be cautious. Evidence is accumulating that titanium dioxide nanoparticles can have a negative inflammatory effect on the intestinal lining.

Similarly, a new study looking at both mice and humans suggests that individuals with inflammatory bowel conditions such as ulcerative colitis and Crohn's disease might experience negative health effects from titanium dioxide nanoparticles - the particles could worsen intestinal inflammation. Interestingly, the nanoparticles accumulated in the spleens of mice used in the study. The researchers also found that levels of titanium were increased in the blood of patients with active colitis. From Science Daily:

Titanium dioxide nanoparticles can exacerbate colitis

Titanium dioxide, one of the most-produced nanoparticles worldwide, is being used increasingly in foodstuffs. When intestinal cells absorb titanium dioxide particles, this leads to increased inflammation and damage to the intestinal mucosa in mice with colitis. Researchers at the University of Zurich recommend that patients with colitis should avoid food containing titanium dioxide particles. The frequency of inflammatory bowel disease like Crohn's disease and ulcerative colitis has been on the rise in many Western countries for decades.... In addition to genetic factors, environmental factors like the Western lifestyle, especially nutrition, play an essential role in the development of these chronic intestinal diseases.

The research of Gerhard Rogler, professor of gastroenterology and hepatology at the University of Zurich, now shows that titanium dioxide nanoparticles can intensify the inflammatory reaction in the bodies of patients with inflammatory intestinal diseases. Titanium dioxide is a white pigment used in medicines, cosmetics and toothpaste and increasingly as food additive E171, for example, in icing, chewing gum or marshmallows. Until now, there have been no restrictions on its use in the food industry.

The scientists led by Gerhard Rogler concentrated their research on a protein complex inside cells: the NLRP3 inflammasome. This protein complex is part of the non-specific immune system, which detects danger signals and then triggers inflammation. If the inflammasome is activated by bacterial components, for example, the inflammatory reaction plays a vital role in the defense against infective agents. In the same way, NLRP3 can be activated by small inorganic particles -- sometimes with negative consequences: if uric acid crystals form in the cells, for example, the resulting inflammation leads to gout.

The research team first studied the effect of inorganic titanium dioxide particles in cell cultures. They were able to show that titanium dioxide can penetrate human intestinal epithelial cells and macrophages and accumulate there. The nanoparticles were detected as danger signals by inflammasomes, which triggered the production of inflammatory messengers. In addition, patients with ulcerative colitis, whose intestinal barrier is disrupted, have an increased concentration of titanium dioxide in their blood. "This shows that these particles can be absorbed from food under certain disease conditions," Rogler says.

In a further step, the scientists orally administered titanium dioxide nanoparticles to mice, which serve as a disease model for inflammatory bowel disease. Here, as well, the particles activated the NLRP3 complex, which led to strong intestinal inflammation and greater damage to the intestinal mucosa in the mice. In addition, titanium dioxide crystals accumulated in the animals' spleens. Whether these findings will be confirmed in humans must now be determined in further studies. "Based on our results," Rogler concludes, "patients with an intestinal barrier dysfunction as found in colitis should abstain from foods containing titanium dioxide."  [Original study.]