
Thunderstorm asthma? This is an asthma attack triggered by a thunderstorm - it is still relatively rare, but predicted to increase with the coming climate changes. During thunderstorms, downdrafts of cold air sweep particles of pollen and mold spores up into the clouds. There they absorb moisture and rupture into small fragments (a size easily inhaled into the lungs), which are then dispersed by rain and wind. When inhaled, they can enter the lungs and trigger an asthma attack - thunderstorm asthma. [Note that larger pollen grains are normally filtered by hairs in the nose - so they don't make it to the lungs.] It seems that people with "hayfever" and allergies to grass pollen are at highest risk - at least in Australia. From Medscape:

Thunderstorm Asthma on the Rise

For seasonal allergy sufferers, rain is usually thought of as a friend—it washes the pollen out of the air. However, there are circumstances in which a particular type of wet weather event can make things much worse: thunderstorms. Asthma epidemics have occurred under such circumstances and have affected patients who have never exhibited asthmatic symptoms before. The most recent severe episode occurred in Melbourne, Australia, in 2016, with 8500 emergency asthma visits and nine deaths.[1]

Recently in the Journal of Allergy and Clinical Immunology, Dr Gennaro D'Amato and colleagues[1] explored the nature of this phenomenon and implications for the future. The authors point out that although rare, these events are expected to occur more often with anticipated climate change. According to the authors, the evidence for this so far is limited to pollen and outdoor mold seasons—but even in the northeastern United States, that is about three quarters of the year.

Pollen grains are large (up to 35 µm in diameter)[2] and usually do not make their way down to the bronchial tree. During a thunderstorm, however, these grains are swept up by a dry updraft, ruptured by high humidity at the cloud base, then forced down by cold air. The resulting fragments contain allergens of just the right size (< 3 µm) to reach the bronchial tree, resulting in asthma symptoms in patients with allergic rhinitis who perhaps have never exhibited asthma before.

The first reported case of thunderstorm asthma was in the United Kingdom in 1983. Since then it has happened in Australia, Canada, the United Kingdom (Alternaria species), the United States, and Italy (olive and Parietaria pollens). Evidence for the role of the Parietaria pollen in the outbreak in Naples, Italy, was supported by high levels of the pollen grains as opposed to low levels of measured particulates and pollutants, including ozone and nitric oxide.

Certainly, people who are sensitized to the relevant allergens are at risk. Beyond that, we can presume that patients who already have poorly controlled asthma or more bronchial hyperresponsiveness would be at risk, as would patients who have other concurrent risk factors for allergic asthma (such as rhinovirus infection).

Could probiotics have a role to play in the treatment of psoriasis? A recent analysis and review of studies suggests that they might. Psoriasis is a non-contagious, chronic disease affecting about 2 to 4% of the population, and which is characterized by patches of abnormal skin. These skin patches are typically red, itchy, and scaly, and can range from a few small areas to covering the entire body. There is no cure for psoriasis, but various treatments can help control the symptoms, such as steroid creams, vitamin D3 cream, ultraviolet light, and immune system suppressing medications.

What did the researchers find? They said that "new evidence suggests that the microbiome may play a pathogenic role in psoriatic disease" - meaning the community of microbes (microbiome) may be involved in this disease. There is dysbiosis of the skin microbiome (the microbial community is out of whack) in areas of skin lesions or patches. Areas of skin lesions had a different microbiome (a "lesional psoriatic microbiome") compared to healthy skin - in these lesions some microbial species increase, which leads to a decrease or elimination of others. And the differences are not just in bacteria, but also in fungi and viruses.

In psoriasis, the microbial community of the gut is also out of whack (dysbiosis of the gut microbiome). And the gut microbiome differs between those with psoriasis limited to just skin patches and those with complications of psoriasis (e.g., psoriatic arthritis) - and several studies found that these shifts in the gut microbiome occurred before the psoriatic complications became evident. That suggests that probiotics might help. But which ones?

The researchers state: "Other changes observed in gut microbiome studies include a decrease in Actinobacteria. This may suggest a protective role of Actinobacteria, a phylum which includes Bifidobacterium species that have been shown to reduce intestinal inflammation, suppress autoimmunity, and induce Tregs." They go on to state that one 2013 study by Groeger et al demonstrated that taking Bifidobacterium infantis 35624 for 6–8 weeks in a randomized, double-blind, placebo-controlled clinical trial reduced inflammatory markers (plasma CRP and TNF-a) in psoriasis patients. Bifidobacterium species, including B. infantis, are commonly found in many multi-strain supplements. So I wonder, what happens if people with psoriasis take them over an extended period? Will the psoriasis skin patches improve? This is currently unknown. But... if you've had success with probiotics as a psoriasis treatment - please let me know. What microbes? And for what symptoms of psoriasis?

From Current Dermatology Reports: The Role of the Skin and Gut Microbiome in Psoriatic Disease

Our review of studies pertaining to the cutaneous microbiome showed a trend towards an increased relative abundance of Streptococcus and a decreased level of Propionibacterium in psoriasis patients compared to controls. In the gut microbiome, the ratio of Firmicutes and Bacteroidetes was perturbed in psoriatic individuals compared to healthy controls. Actinobacteria was also relatively underrepresented in psoriasis patients relative to healthy individuals.

Summary: Although the field of the psoriatic microbiome is relatively new, these first studies reveal interesting differences in microbiome composition that may be associated with the development of psoriatic comorbidities and serve as novel therapeutic targets.

Psoriasis. Credit: Medscape

Finally, some good news regarding ticks and the diseases they can transmit to humans. Currently ticks in the US are known to transmit at least 14 diseases, including Lyme disease. But a recent study done in the Netherlands found that the presence of predators such as foxes resulted in mice and voles having fewer ticks on them - a really big reduction in both tick numbers and the percentage of ticks infected with a disease. The researchers thought that this was due to the mice and voles being less active when predators were nearby, and also that mice and voles that did venture further were preyed upon and eaten by the predators. So be happy if you see foxes in your neighborhood - they're beneficial. Excerpts from the NY Times:

Lyme Disease’s Worst Enemy? It Might Be Foxes

It is August, the month when a new generation of black-legged ticks that transmit Lyme disease and other viruses are hatching. On forest floors, suburban estates and urban parks, they are looking for their first blood meal. And very often, in the large swaths of North America and Europe where tick-borne disease is on the rise, they are feeding on the ubiquitous white-footed mice and other small mammals notorious for harboring pathogens that sicken humans.

But it doesn’t have to be that way. A new study suggests that the rise in tick-borne disease may be tied to a dearth of traditional mouse predators, whose presence might otherwise send mice scurrying into their burrows. If mice were scarcer, larval ticks, which are always born uninfected, might feed on other mammals and bird species that do not carry germs harmful to humans. Or they could simply fail to find that first meal. Ticks need three meals to reproduce; humans are at risk of contracting diseases only from ticks that have previously fed on infected hosts.

For the study, Tim R. Hofmeester, then a graduate student at Wageningen University in the Netherlands and the lead researcher of the study, placed cameras in 20 plots across the Dutch countryside to measure the activity of foxes and stone martens, key predators of mice. Some were in protected areas, others were in places where foxes are heavily hunted. Over two years, he also trapped hundreds of mice — and voles, another small mammal — in the same plots, counted how many ticks were on them, and tested the ticks for infection with Lyme and two other disease-causing bacteria. To capture additional ticks, he dragged a blanket across the ground.

In the plots where predator activity was higher, he found only 5 to 10 percent as many newly hatched ticks on the mice as in areas where predators were scarcer. Thus, there would be fewer ticks to pass along pathogens to the next generation of mice. In the study, the density of infected “nymphs,” as the adolescent ticks are called, was reduced to 6 percent of previous levels in areas where foxes were more active. “The predators appear to break the cycle of infection,’’ said Dr. Hofmeester, who earned his Ph.D. after the study.

Interestingly, the predator activity in Dr. Hofmeester’s plots did not decrease the density of the mouse population itself, as some ecologists had theorized it might. Instead, the lower rates of infected ticks, Dr. Hofmeester suggested in the paper, published in Proceedings of the Royal Society B, may be the result of small mammals curtailing their own movement when predators are around. [Original study.]

Two more studies found that higher levels of vitamin D in the blood are associated with better health outcomes - one study found a lower risk of breast cancer, especially among postmenopausal women, and the other found better outcomes after a metastatic melanoma diagnosis.

The breast cancer study suggested that a fairly high blood level of vitamin D (25(OH)D serum level > 38.0 ng/mL) was associated with a lower risk of breast cancer. But overall they found that women supplementing with vitamin D (more than 4 times a week) at any dose had a lower risk of breast cancer over a 5-year period than those not supplementing with vitamin D. From Environmental Health Perspectives:
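One practical wrinkle: US lab reports usually give 25(OH)D in ng/mL, while European labs often use nmol/L. A minimal sketch of the conversion - the helper function name is my own, but the 2.496 conversion factor for 25(OH)D is the standard one (based on the molar mass of calcifediol, about 400.6 g/mol):

```python
# Convert serum 25(OH)D between the two common units.
# 1 ng/mL of 25(OH)D = 2.496 nmol/L (standard conversion factor).
NG_PER_ML_TO_NMOL_PER_L = 2.496

def to_nmol_per_l(ng_per_ml):
    """Convert a serum 25(OH)D level from ng/mL to nmol/L."""
    return ng_per_ml * NG_PER_ML_TO_NMOL_PER_L

# The study's "lower risk" level of >38.0 ng/mL, and the common
# 20 ng/mL deficiency cutoff, expressed in nmol/L:
print(round(to_nmol_per_l(38.0), 1))  # 94.8
print(round(to_nmol_per_l(20.0), 1))  # 49.9
```

So a reading of roughly 95 nmol/L on a European-style report corresponds to the 38 ng/mL level discussed above.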

Serum Vitamin D and Risk of Breast Cancer within Five Years

Vitamin D is an environmental and dietary agent with known anticarcinogenic effects, but protection against breast cancer has not been established. We evaluated the association between baseline serum 25-hydroxyvitamin D [25(OH)D] levels, supplemental vitamin D use, and breast cancer incidence over the subsequent 5 y of follow-up. From 2003-2009, the Sister Study enrolled 50,884 U.S. women 35-74 y old who had a sister with breast cancer but had never had breast cancer themselves. Using liquid chromatography-mass spectrometry, we measured 25(OH)D in serum samples from 1,611 women who later developed breast cancer and from 1,843 randomly selected cohort participants.

We found that 25(OH)D levels were associated with a 21% lower breast cancer hazard (highest versus lowest quartile). Analysis of the first 5 y of follow-up for all 50,884 Sister Study participants showed that self-reported vitamin D supplementation was associated with an 11% lower hazard. These associations were particularly strong among postmenopausal women.

In this cohort of women with elevated risk, high serum 25(OH)D levels and regular vitamin D supplement use were associated with lower rates of incident, postmenopausal breast cancer over 5 y of follow-up. These results may help to establish clinical benchmarks for 25(OH)D levels; in addition, they support the hypothesis that vitamin D supplementation is useful in breast cancer prevention.

The first sentence in the melanoma study lays out what is widely known: "Vitamin D deficiency (≤20 ng/mL) is associated with an increased incidence and worse prognosis of various types of cancer including melanoma." Studies show that the relationship between vitamin D, sunlight exposure, and melanoma is complicated in a number of ways - for example, sun exposure may be associated with increased survival in patients with melanoma, which may mean that vitamin D has a protective role in patients with melanoma. Several studies suggest that vitamin D may delay melanoma recurrence and improve overall prognosis. The study also found that metastatic melanoma patients with vitamin D deficiency who are unable to or don't raise their vitamin D blood levels (25(OH)D3) have a worse outcome compared to those who are able to markedly increase (by more than 20 ng/mL) their 25(OH)D3 levels. From Oncotarget:

Vitamin D deficiency is associated with a worse prognosis in metastatic melanoma

Vitamin D deficiency (≤20 ng/mL) is associated with an increased incidence and worse prognosis of various types of cancer including melanoma. A retrospective, single-center study of individuals diagnosed with melanoma from January 2007 through June 2013 who had a vitamin D (25(OH)D3) level measured within one year of diagnosis was performed to determine whether vitamin D deficiency and repletion are associated with melanoma outcome.

A total of 409 individuals diagnosed with histopathology-confirmed melanoma who had an ever-measured serum 25(OH)D3 level were identified. 252 individuals with a 25(OH)D3 level recorded within one year after diagnosis were included in the study.... A worse melanoma prognosis was associated with vitamin D deficiency, higher stage, ulceration, and higher mitotic rate. In patients with stage IV metastatic melanoma, vitamin D deficiency was associated with significantly worse melanoma-specific mortality. Patients with metastatic melanoma who were initially vitamin D deficient and subsequently had a decrease or ≤20 ng/mL increase in their 25(OH)D3 concentration had significantly worse outcomes compared to non-deficient patients who had a >20 ng/mL increase. Our results suggest that initial vitamin D deficiency and insufficient repletion is associated with a worse prognosis in patients with metastatic melanoma.

The last post discussed the steep ongoing decline in sperm counts and sperm concentration in men from North America, Europe, Australia, and New Zealand. It mentioned a number of environmental causes that could be contributing to this, including the huge increase of chemicals, especially endocrine disruptors (chemicals that disrupt our hormones), over the past few decades.

But another study was also just published that showed (in mice) that effects of chronic exposure to endocrine disrupting chemicals are amplified over 3 generations - and each generation has even lower sperm counts, sperm concentration, and reproductive abnormalities. So each generation gets progressively worse with continued exposure.

As the researchers state: "Our findings suggest that neonatal estrogenic exposure can affect both the reproductive tract and sperm production in exposed males, and exposure effects are exacerbated by exposure spanning multiple generations. Because estrogenic chemicals have become both increasingly common and ubiquitous environmental contaminants in developed countries, the implications for humans are serious. Indeed, it is possible effects are already apparent, with population-based studies from the U.S., Europe, Japan, and China reporting reductions in sperm counts/quality and male fertility within a span of several decades." Yikes...

The World Health Organization (WHO) considers the ability to fertilize an egg impaired at 40 million sperm per milliliter or below, and the level at which WHO considers fertilization unlikely is 15 million sperm per milliliter. This is why the sperm count study discussed in the last post is so frightening: North American, European, Australian, and New Zealand men who are not confirmed fertile (i.e., their partners are not pregnant and they have no children) have experienced a drop in average sperm count of about 50 percent over four decades, to 47 million sperm per milliliter. Niels Skakkebæk, a Danish pediatrician and researcher working on this topic, said: "Here in Denmark, there is an epidemic of infertility." and "Most worryingly [in Denmark] is that semen quality is in general so poor that an average young Danish man has much fewer sperm than men had a couple of generations ago, and more than 90 percent of their sperm are abnormal." Uh-oh... What will it take for governments to address this serious issue?
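To make the WHO thresholds described above concrete, here is a minimal sketch that places a sperm concentration into one of the three bands; the function name and category labels are my own illustrative choices, not clinical terminology:

```python
# Classify a sperm concentration (million/ml) against the WHO-based
# thresholds described above: <= 15 fertilization unlikely,
# <= 40 impaired ability to fertilize.
def classify_sperm_concentration(million_per_ml):
    if million_per_ml <= 15:
        return "fertilization considered unlikely (<= 15 million/ml)"
    elif million_per_ml <= 40:
        return "impaired ability to fertilize (<= 40 million/ml)"
    return "above the WHO impairment threshold"

# The ~47 million/ml 2011 Western average is still above the threshold,
# but with far less margin than the 1973 average of ~99 million/ml.
print(classify_sperm_concentration(47))  # above the WHO impairment threshold
print(classify_sperm_concentration(35))
```

The point of the sketch: at the reported rate of decline, the Western average is only a few million/ml above the impairment band.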

In the meantime, see the last post for some tips on how to reduce your own exposure to endocrine disrupting chemicals. Just note that you can reduce exposure, but you can't totally eliminate exposure. Excerpts from Environmental Health News:

Science: Are we in a male fertility death spiral?

Margaret Atwood's 1985 book, The Handmaid's Tale, played out in a world with declining human births because pollution and sexually transmitted disease were causing sterility. Does fiction anticipate reality? Two new research papers add scientific weight to the possibility that pollution, especially endocrine disrupting chemicals (EDCs), are undermining male fertility.

The first, published Tuesday, is the strongest confirmation yet obtained that human sperm concentration and count are in a long-term decline: more than 50 percent from 1973 to 2013, with no sign that the decline is slowing. "The study is a wakeup that we are in a death spiral of infertility in men," said Frederick vom Saal, Curators’ Distinguished Professor Emeritus of Biological Sciences at the University of Missouri and an expert on endocrine disruption who was not part of either study.

The second study, published last week by different authors, offers a possible explanation. It found that early life exposure of male mouse pups to a model environmental estrogen, ethinyl estradiol, causes mistakes in development in the reproductive tract that will lead to lower sperm counts. According to vom Saal, the second study "provides a mechanistic explanation for a progressive decrease in sperm count over generations." What makes this study unique is that it examined what happened when three successive generations of males were exposed—instead of just looking only at the first. Hunt, in an email, said "we asked a simple question with real-world relevance that had simply never been addressed."

In the real world, since World War II, successive generations of people have been exposed to a growing number and quantity of environmental estrogens—chemicals that behave like the human hormone estrogen. Thousands of papers published in the scientific literature (reviewed here) tie these to a wide array of adverse consequences, including infertility and sperm count decline. This phenomenon—exposure of multiple generations of mammals to endocrine disrupting compounds—had never been studied experimentally, even though that's how humans have experienced EDC exposures for at least the last 70 years. That's almost three generations of human males. Men moving into the age of fatherhood are ground zero for this serial exposure.

So Horan, Hunt and their colleagues at WSU set out to mimic, for the first time, this real-world reality. They discovered that the effects are amplified in successive generations. They observed adverse effects starting in the first generation of mouse lineages where each generation was exposed for a brief period shortly after birth. The impacts worsened in the second generation compared to the first, and by the third generation the scientists were finding animals that could not produce sperm at all. This latter condition was not seen in the first two generations exposed. Details of the experimental results actually suggested that multiple generations of exposure may have increased male sensitivity to the chemical. [Original study.]

Once again a study (this time a review and meta-analysis of other studies) found an alarming and steep decline in sperm counts in men from Western countries over a 40 year period. This steep decline for both sperm concentration (SC) and total sperm count (TSC) is for men in North America, Europe, Australia, and New Zealand. The sperm count and sperm concentration declined 50 to 60% in the period between 1973 to 2011 - with a downward slope showing a decline of -1.4% to -1.6% per year. On the other hand, men from South America, Asia and Africa did not show a decline.

The authors of the study were very concerned over the results showing this decline in Western countries, with no evidence of the decline leveling off. As these declines continue, more and more men will have sperm counts below the point at which they can reproduce. Instead they will be infertile or "sub-fertile" (with a decreased probability of conceiving a child). The threshold level associated with a "decreased monthly probability of conception" is considered to be 40 million/ml. Shockingly, this study found that in 1973, among Western men who were not selected for fertility and didn't know their fertility status (e.g., college students, men screened for the military), the average sperm concentration was 99 million/ml, but by 2011 it was 47.1 million/ml. These men were called "unselected" and are likely to be representative of men in the general population. Men known to be fertile (e.g., had fathered a child) were at 83.8 million/ml in 1976, but were down to 62.0 million/ml in 2011. Both groups had consistent declines year after year.
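A quick arithmetic check shows how the figures above for "unselected" Western men fit together - the calculation is mine, using only the numbers reported in the study:

```python
# Arithmetic check of the reported decline for "unselected" Western men.
start, end = 99.0, 47.1   # million/ml, in 1973 and 2011
years = 2011 - 1973       # 38 years of follow-up

total_decline_pct = (start - end) / start * 100
per_year_pct = total_decline_pct / years  # linear slope, relative to the 1973 baseline

print(round(total_decline_pct, 1))  # 52.4 -> within the ~50-60% decline reported
print(round(per_year_pct, 2))       # 1.38 -> consistent with the ~-1.4%/yr slope
```

In other words, the 99 to 47.1 million/ml drop, the roughly 52% total decline, and the approximately -1.4% per year slope are all the same finding expressed three ways.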

What about the men from South America, Asia, and Africa? There, studies showed that the "unselected" men (not selected for fertility and who didn't know their fertility status) started out at 72.7 million/ml in 1983, and were at 62.6 million/ml in 2011, while men known to be fertile started out on average at 66.4 million/ml in 1978 and were at 75.7 million/ml in 2011. They did not show the decline of the North American, European, Australian, and New Zealand group of men.

What does this mean? And what is going on? These results go beyond fertility and reproduction. The decline is consistent with other male reproductive health indicators over the last few decades: higher incidence of testicular cancer, higher rates of cryptorchidism, earlier onset of male puberty, and decline in average testosterone levels. It appears that men's sperm counts are "the canary in the mine" for male health - evidence of harm to men from environmental and lifestyle influences. These Western developed countries are awash in chemicals and plastics, with endocrine disruptors (hormone disruptors) in our foods, our personal care products, etc. - and so studies find these chemicals in all of us (in varying degrees). Same with flame retardants, pesticides, and "scented" products. Add exposure to all sorts of environmental pollutants - whether in air, water, soil, or our food - such as high levels of aluminum. All of these can have an effect on sperm counts and reproductive health. And note that chemicals that can depress sperm counts are also linked to many health problems, including chronic diseases.

What can I do? You can lower your exposure to many chemicals (e.g., pesticides), plastics, and endocrine disruptors, but you can't avoid them totally. Yes, it'll mean reading labels and ingredient lists on foods, personal care products (such as soaps, shampoo, lotion), and products used in the home. TRY TO AVOID OR LOWER EXPOSURE TO: phthalates, parabens, BPA, BPS, and even BPA-free labeled products (all use similar chemicals), flame retardants (e.g., in upholstered furniture and rugs), stain-resistant, dirt-resistant, and waterproof coatings, Scotchgard, non-stick cookware coatings, dryer sheets, scented products (including scented candles and air fresheners), fragrances, pesticides in the yard and home, and "odor-free", antibacterial, antimicrobial, anti-mildew products. Don't microwave foods in plastic containers (including microwave popcorn bags).

INSTEAD: Try to eat more organic foods, look for organic or least-toxic Integrated Pest Management (IPM) alternatives for the home and garden. Store foods as much as possible in glass, ceramic, or stainless steel containers. Buy foods, if possible, that are in glass bottles - not cans (all lined with endocrine disrupting chemicals) and not plastic bottles or containers (plastics leach). Some people use water filters because there are so many contaminants in our water, even if they meet federal guidelines on "allowable levels" in the water. Avoid cigarette smoke or smoking. Try to lose weight if overweight. Open windows now and then in your residence to lower indoor air pollution. The list is long - yes, a lifestyle change! (see posts on ENDOCRINE DISRUPTORS, FLAME RETARDANTS, and PESTICIDES)

From Medical Xpress: Study shows a significant ongoing decline in sperm counts of Western men

In the first systematic review and meta-analysis of trends in sperm count, researchers from the Hebrew University-Hadassah Braun School of Public Health and Community Medicine and the Icahn School of Medicine at Mount Sinai report a significant decline in sperm concentration and total sperm count among men from Western countries.

By screening 7,500 studies and conducting a meta-regression analysis on 185 studies between 1973 and 2011, the researchers found a 52.4 percent decline in sperm concentration, and a 59.3 percent decline in total sperm count, among men from North America, Europe, Australia and New Zealand who were not selected based on their fertility status. In contrast, no significant decline was seen in South America, Asia and Africa, where far fewer studies have been conducted. The study also indicates the rate of decline among Western men is not decreasing: the slope was steep and significant even when analysis was restricted to studies with sample collection between 1996 and 2011.

The findings have important public health implications. First, these data demonstrate that the proportion of men with sperm counts below the threshold for subfertility or infertility is increasing. Moreover, given the findings from recent studies that reduced sperm count is related to increased morbidity and mortality, the ongoing decline points to serious risks to male fertility and health.

"Decreasing sperm count has been of great concern since it was first reported twenty-five years ago. This definitive study shows, for the first time, that this decline is strong and continuing. The fact that the decline is seen in Western countries strongly suggests that chemicals in commerce are playing a causal role in this trend," said Dr. Shanna H. Swan, a professor in the Department of Environmental Medicine and Public Health at the Icahn School of Medicine at Mount Sinai, New York.

While the current study did not examine causes of the observed declines, sperm count has previously been plausibly associated with environmental and lifestyle influences, including prenatal chemical exposure, adult pesticide exposure, smoking, stress and obesity. Therefore, sperm count may sensitively reflect the impact of the modern environment on male health across the lifespan and serve as a "canary in the coal mine" signaling broader risks to male health. [Original study.]

Human sperm. Credit: Wikipedia

All of us at some point or another have wondered if we can hold on to medicines past their expiration date, or do we need to throw them out? And if they're still good past the expiration date, how much past? Well... the investigative journalism site ProPublica has been examining this issue, and they published an article saying researchers and the government find that many medicines may be good for YEARS past the expiration date. Yes - years!

According to people interviewed for the article, there are no documented cases of negative effects from taking expired drugs. Most medicines are fairly stable and last past the expiration date - the exceptions they mention are liquid medicines, the asthma inhalant albuterol, the topical rash spray diphenhydramine, and a local anesthetic made from lidocaine and epinephrine. Not mentioned, but there is also another one that degrades and can cause health problems such as Fanconi Syndrome: tetracycline. Other than those, it looks like many medicines are incredibly stable past the expiration date - especially when stored properly and in the original sealed containers. Do go read the whole article - it's eye-opening, including the part on the government's "Shelf Life Extension Program". Excerpts from ProPublica:

The Myth of Drug Expiration Dates

The box of prescription drugs had been forgotten in a back closet of a retail pharmacy for so long that some of the pills predated the 1969 moon landing. Most were 30 to 40 years past their expiration dates — possibly toxic, probably worthless. But to Lee Cantrell, who helps run the California Poison Control System, the cache was an opportunity to answer an enduring question about the actual shelf life of drugs: Could these drugs from the bell-bottom era still be potent?

Gerona and Cantrell, a pharmacist and toxicologist, knew that the term “expiration date” was a misnomer. The dates on drug labels are simply the point up to which the Food and Drug Administration and pharmaceutical companies guarantee their effectiveness, typically at two or three years. But the dates don’t necessarily mean they’re ineffective immediately after they “expire” — just that there’s no incentive for drugmakers to study whether they could still be usable.

What if the system is destroying drugs that are technically “expired” but could still be safely used? In his lab, Gerona ran tests on the decades-old drugs, including some now defunct brands such as the diet pills Obocell (once pitched to doctors with a portly figurine called “Mr. Obocell”) and Bamadex. Overall, the bottles contained 14 different compounds, including antihistamines, pain relievers and stimulants. All the drugs tested were in their original sealed containers. The findings surprised both researchers: A dozen of the 14 compounds were still as potent as they were when they were manufactured, some at almost 100 percent of their labeled concentrations. “Lo and behold,” Cantrell says, “The active ingredients are pretty darn stable.”

....That raises an even bigger question: If some drugs remain effective well beyond the date on their labels, why hasn’t there been a push to extend their expiration dates? It turns out that the FDA, the agency that helps set the dates, has long known the shelf life of some drugs can be extended, sometimes by years. For decades, the federal government has stockpiled massive stashes of medication, antidotes and vaccines in secure locations throughout the country. The drugs are worth tens of billions of dollars and would provide a first line of defense in case of a large-scale emergency.

Maintaining these stockpiles is expensive. The drugs have to be kept secure and at the proper humidity and temperature so they don’t degrade. Luckily, the country has rarely needed to tap into many of the drugs, but this means they often reach their expiration dates. Though the government requires pharmacies to throw away expired drugs, it doesn’t always follow these instructions itself. Instead, for more than 30 years, it has pulled some medicines and tested their quality.

Once a drug is launched, the makers run tests to ensure it continues to be effective up to its labeled expiration date. Since they are not required to check beyond it, most don’t, largely because regulations make it expensive and time-consuming for manufacturers to extend expiration dates, says Yan Wu, an analytical chemist who is part of a focus group at the American Association of Pharmaceutical Scientists that looks at the long-term stability of drugs. Most companies, she says, would rather sell new drugs and develop additional products. Pharmacists and researchers say there is no economic “win” for drug companies to investigate further. They ring up more sales when medications are tossed as “expired” by hospitals, retail pharmacies and consumers despite retaining their safety and effectiveness.

That being said, it’s an open secret among medical professionals that many drugs maintain their ability to combat ailments well after their labels say they don’t.....In 1986, the Air Force, hoping to save on replacement costs, asked the FDA if certain drugs’ expiration dates could be extended. In response, the FDA and Defense Department created the Shelf Life Extension Program. Each year, drugs from the stockpiles are selected based on their value and pending expiration and analyzed in batches to determine whether their end dates could be safely extended. For several decades, the program has found that the actual shelf life of many drugs is well beyond the original expiration dates.

A 2006 study of 122 drugs tested by the program showed that two-thirds of the expired medications were stable every time a lot was tested. Each of them had its expiration date extended, on average, by more than four years, according to research published in the Journal of Pharmaceutical Sciences. Some that failed to hold their potency include the common asthma inhalant albuterol, the topical rash spray diphenhydramine, and a local anesthetic made from lidocaine and epinephrine, the study said. But neither Cantrell nor Dr. Cathleen Clancy, associate medical director of National Capital Poison Center, a nonprofit organization affiliated with the George Washington University Medical Center, had heard of anyone being harmed by any expired drugs. Cantrell says there has been no recorded instance of such harm in medical literature.

The use of nanoparticles in foods is increasing every year, but we still know very little about whether they pose health risks to humans, especially for people eating foods with them daily (thus having chronic exposure). The nanoparticles in foods are ingredients so small that they are measured in nanometers, or billionths of a meter. The most common nanoingredients are titanium dioxide, silicon dioxide, and zinc oxide. Titanium dioxide is typically used as a "food coloring" to make foods whiter or brighter, but it may or may not be listed on the label. In Europe, this food additive is known as E171. Currently there are no restrictions on using titanium dioxide nanoparticles in food.

Recent research suggests that there may be health effects from the nanoparticles in our food (here and here), thus we should be cautious. Evidence is accumulating that titanium dioxide nanoparticles can have a negative inflammatory effect on the intestinal lining. Similarly, a new study looking at both mice and humans suggests that individuals with inflammatory bowel diseases such as ulcerative colitis and Crohn's disease might have negative health effects from titanium dioxide nanoparticles - that the particles could worsen intestinal inflammation. Interestingly, the nanoparticles accumulated in the spleens of mice used in the study. The researchers also found that levels of titanium were increased in the blood of patients with active colitis. From Science Daily:

Titanium dioxide nanoparticles can exacerbate colitis

Titanium dioxide, one of the most-produced nanoparticles worldwide, is being used increasingly in foodstuffs. When intestinal cells absorb titanium dioxide particles, this leads to increased inflammation and damage to the intestinal mucosa in mice with colitis. Researchers at the University of Zurich recommend that patients with colitis should avoid food containing titanium dioxide particles. The frequency of inflammatory bowel diseases like Crohn's disease and ulcerative colitis has been on the rise in many Western countries for decades.... In addition to genetic factors, environmental factors like the Western lifestyle, especially nutrition, play an essential role in the development of these chronic intestinal diseases.

The research of Gerhard Rogler, professor of gastroenterology and hepatology at the University of Zurich, now shows that titanium dioxide nanoparticles can intensify the inflammatory reaction in the bodies of patients with inflammatory intestinal diseases. Titanium dioxide is a white pigment used in medicines, cosmetics and toothpaste and increasingly as food additive E171, for example, in icing, chewing gum or marshmallows. Until now, there have been no restrictions on its use in the food industry.

The scientists led by Gerhard Rogler concentrated their research on a protein complex inside cells: the NLRP3 inflammasome. This protein complex is part of the non-specific immune system, which detects danger signals and then triggers inflammation. If the inflammasome is activated by bacterial components, for example, the inflammatory reaction plays a vital role in the defense against infective agents. In the same way, NLRP3 can be activated by small inorganic particles -- sometimes with negative consequences: if uric acid crystals form in the cells, for example, the inflammation leads to gout.

The research team first studied the effect of inorganic titanium dioxide particles in cell cultures. They were able to show that titanium dioxide can penetrate human intestinal epithelial cells and macrophages and accumulate there. The nanoparticles were detected as danger signals by inflammasomes, which triggered the production of inflammatory messengers. In addition, patients with ulcerative colitis, whose intestinal barrier is disrupted, have an increased concentration of titanium dioxide in their blood. "This shows that these particles can be absorbed from food under certain disease conditions," Rogler says.

In a further step, the scientists orally administered titanium dioxide nanoparticles to mice, which serve as a disease model for inflammatory bowel disease. Here, as well, the particles activated the NLRP3 complex, which led to strong intestinal inflammation and greater damage to the intestinal mucosa in the mice. In addition, titanium dioxide crystals accumulated in the animals' spleens. Whether these findings will be confirmed in humans must now be determined in further studies. "Based on our results," Rogler concludes, "patients with an intestinal barrier dysfunction as found in colitis should abstain from foods containing titanium dioxide."  [Original study.]

Red meat allergies from a lone star tick bite? I first read about this a few years ago in Science Daily and it seemed pretty incredible - eat some red meat (beef, pork, or venison) and a few hours later have severe allergy symptoms such as itching, hives, swelling, shortness of breath, vomiting, and diarrhea. And the allergy starts after a person is bitten by a lone star tick.

A few years ago the red meat allergy seemed to occur only in the southeastern United States. But recently the severe red meat allergies are occurring in new places (such as Minnesota and Long Island, NY) - so it appears that either the area where this tick lives is spreading or other species of ticks are also now causing this allergy. By the way, once a person has this allergy there is no cure, vaccine, or treatment other than avoiding red meat, treating the allergy symptoms, and carrying an EpiPen (just in case). It is also referred to as alpha-gal allergy syndrome because the allergy is to the sugar molecule commonly called alpha-gal, which is found in red meat and some medications (such as the cancer drug cetuximab). From Wired:

OH, LOVELY: THE TICK THAT GIVES PEOPLE MEAT ALLERGIES IS SPREADING

First comes the unscratchable itching, and the angry blossoming of hives. Then stomach cramping, and—for the unluckiest few—difficulty breathing, passing out, and even death. In the last decade and a half, thousands of previously protein-loving Americans have developed a dangerous allergy to meat. And they all have one thing in common: the lone star tick.

Red meat, you might be surprised to know, isn’t totally sugar-free. It contains a few protein-linked saccharides, including one called galactose-alpha-1,3-galactose, or alpha-gal, for short. More and more people are learning this the hard way, when they suddenly develop a life-threatening allergy to that pesky sugar molecule after a tick bite.

Yep, one bite from the lone star tick—which gets its name from the Texas-shaped splash of white on its back—is enough to reprogram your immune system to forever reject even the smallest nibble of perfectly crisped bacon. For years, physicians and researchers only reported the allergy in places the lone star tick calls home, namely the southeastern United States. But recently it’s started to spread. The newest hot spots? Duluth, Minnesota, Hanover, New Hampshire, and the eastern tip of Long Island, where at least 100 cases have been reported in the last year. Scientists are racing to trace its spread, to understand if the lone star tick is expanding into new territories, or if other species of ticks are now causing the allergy.

Over the next few years, Platts-Mills and his colleague Scott Commins screened more meat allergy patients and discovered that 80 percent reported being bitten by a tick. What's more, they showed that tick bites led to a 20-fold increase in alpha-gal antibodies. Since ethics standards prevented them from attaching ticks to randomized groups of patients, this data was the best they could do to guess how meat allergy arises. Something in the tick's saliva hijacks humans' immune systems, red-flagging alpha-gal and triggering the massive release of histamines whenever red meat is consumed. Researchers are still trying to find what that something is.

Whatever it is, allergy researchers will be paying attention. Because, as far as anyone can tell, alpha-gal syndrome seems to be the only allergy that affects all people, regardless of genetic makeup. “There’s something really special about this tick,” says Jeff Wilson, an asthma, allergy, and immunology fellow in Platts-Mills’ group. Usually a mix of genes and environmental factors combine to create allergies. But when it comes to the lone star tick it doesn’t matter if you’re predisposed or not. “Just a few bites and you can render anyone really, really allergic,” he says.

Lone star tick. Credit: CDC Public Health Image Library

The following is a study with weird results, really weird results. And it makes me think of all the times I've heard people joke, "Just smelling food makes me gain weight," because we all knew it wasn't true. But what if it was true? .... The results of this study done in mice show that actually smelling the food one eats results in weight gain, and not being able to smell the food results in weight loss - even if both groups eat the same amount of food. And the "supersmellers" (those with a "boosted" sense of smell) gained the most weight of all.

What? How could that be? Yes, the study was done in mice, but perhaps it also applies to humans (the researchers think so). The researchers think that the odor of what we eat may play an important role in how the body deals with calories - if you can't smell your food, you may burn it rather than store it. In other words, there is a link between smell and metabolism. Excerpts from Science Daily:

Smelling your food makes you fat

Our sense of smell is key to the enjoyment of food, so it may be no surprise that in experiments at the University of California, Berkeley, obese mice who lost their sense of smell also lost weight. What's weird, however, is that these slimmed-down but smell-deficient mice ate the same amount of fatty food as mice that retained their sense of smell and ballooned to twice their normal weight. In addition, mice with a boosted sense of smell—super-smellers—got even fatter on a high-fat diet than did mice with normal smell.

The findings suggest that the odor of what we eat may play an important role in how the body deals with calories. If you can't smell your food, you may burn it rather than store it. These results point to a key connection between the olfactory or smell system and regions of the brain that regulate metabolism, in particular the hypothalamus, though the neural circuits are still unknown. The new study, published this week in the journal Cell Metabolism, implies that the loss of smell itself plays a role, and suggests possible interventions for those who have lost their smell as well as those having trouble losing weight. "Sensory systems play a role in metabolism. Weight gain isn't purely a measure of the calories taken in; it's also related to how those calories are perceived," said senior author Andrew Dillin,...

The smell-deficient mice rapidly burned calories by up-regulating their sympathetic nervous system, which is known to increase fat burning. The mice turned their beige fat cells—the subcutaneous fat storage cells that accumulate around our thighs and midriffs - into brown fat cells, which burn fatty acids to produce heat. Some turned almost all of their beige fat into brown fat, becoming lean, mean burning machines. In these mice, white fat cells—the storage cells that cluster around our internal organs and are associated with poor health outcomes—also shrank in size. The obese mice, which had also developed glucose intolerance - a condition that leads to diabetes—not only lost weight on a high-fat diet, but regained normal glucose tolerance.

On the negative side, the loss of smell was accompanied by a large increase in levels of the hormone noradrenaline, which is a stress response tied to the sympathetic nervous system. In humans, such a sustained rise in this hormone could lead to a heart attack.

Dillin and Riera developed two different techniques to temporarily block the sense of smell in adult mice. .... In both cases, the smell-deficient mice ate as much of the high-fat food as did the mice that could still smell. But while the smell-deficient mice gained at most 10 percent more weight, going from 25-30 grams to 33 grams, the normal mice gained about 100 percent of their normal weight, ballooning up to 60 grams. For the former, insulin sensitivity and response to glucose - both of which are disrupted in metabolic disorders like obesity - remained normal.

Mice that were already obese lost weight after their smell was knocked out, slimming down to the size of normal mice while still eating a high-fat diet. These mice lost only fat weight, with no effect on muscle, organ or bone mass. The UC Berkeley researchers then teamed up with colleagues in Germany who have a strain of mice that are supersmellers, with more acute olfactory nerves, and discovered that they gained more weight on a standard diet than did normal mice. [Original study.]