
Head lice are a big, big concern for parents of young school-age children. From Science Daily:

Ordinary conditioner removes head lice eggs as effectively as special products

Eggs from head lice, also called nits, are incredibly difficult to remove. Female lice lay eggs directly onto strands of hair, and they cement them in place with a glue-like substance, making them hard to get rid of. In fact, the eggs are glued down so strongly that they will stay in place even after hair has been treated with pediculicides -- substances used to kill lice.

Some shampoos and conditioners that contain chemicals or special oils are marketed as nit-removal products. However, new research just published in the Journal of Medical Entomology shows that ordinary hair conditioner is just as effective.

In an article called "Efficacy of Products to Remove Eggs of Pediculus humanus capitis (Phthiraptera: Pediculidae) From the Human Hair," scientists from Belgium gathered 605 hairs from six different children. Each hair had a single nit attached to it. Approximately 14% of the eggshells contained a dead egg, whereas the rest were empty.

They then tried to remove the eggs and tested the amount of force needed to do so. They found that nits on the hairs that were left completely untreated were the most difficult to remove. Eggs on hairs that had been soaked in deionized water were much easier to remove, as were the eggs on hairs that had been treated with ordinary hair conditioner and with products specifically marketed for the purpose of nit removal.

However, they found no significant differences between the ordinary conditioners and the special nit-removal products. In all cases, less force was required to remove the nits after the hair had been treated, but the effectiveness of the products was essentially the same.

The authors hypothesize that the deionized water was effective because it acts as a lubricant, reducing friction so that less force is needed to slide the nits off the hairs. The same goes for the conditioners.

From Science Daily:

Dropped your toast? Five-second food rule exists, new research suggests

Food picked up just a few seconds after being dropped is less likely to contain bacteria than if it is left for longer periods of time, according to the findings of research carried out at Aston University's School of Life and Health Sciences.

The findings suggest there may be some scientific basis to the '5 second rule' -- the urban myth about it being fine to eat food that has only had contact with the floor for five seconds or less. Although people have long followed the 5 second rule, until now it was unclear whether it actually helped.

The study, undertaken by final year Biology students and led by Anthony Hilton, Professor of Microbiology at Aston University, monitored the transfer of the common bacteria Escherichia coli (E. coli) and Staphylococcus aureus from a variety of indoor floor types (carpet, laminate and tiled surfaces) to toast, pasta, biscuit and a sticky sweet when contact was made from 3 to 30 seconds.

The results showed that:

- Time is a significant factor in the transfer of bacteria from a floor surface to a piece of food.
- The type of flooring the food has been dropped on also has an effect: bacteria were least likely to transfer from carpeted surfaces and most likely to transfer from laminate or tiled surfaces to moist foods making contact for more than 5 seconds.

Professor Hilton said: "Consuming food dropped on the floor still carries an infection risk, as it very much depends on which bacteria are present on the floor at the time. However, the findings of this study will bring some light relief to those who have been employing the five-second rule for years, despite a general consensus that it is purely a myth. We have found evidence that transfer from indoor flooring surfaces is incredibly poor, with carpet actually posing the lowest risk of bacterial transfer onto dropped food."

Two studies about Vitamin D. From Science Daily:

Vitamin D increases breast cancer patient survival, study shows

Breast cancer patients with high levels of vitamin D in their blood are twice as likely to survive the disease as women with low levels of this nutrient, report University of California, San Diego School of Medicine researchers in the March issue of Anticancer Research.

In previous studies, Cedric F. Garland, DrPH, professor in the Department of Family and Preventive Medicine, showed that low vitamin D levels were linked to a high risk of premenopausal breast cancer. That finding, he said, prompted him to question the relationship between 25-hydroxyvitamin D -- a metabolite produced by the body from the ingestion of vitamin D -- and breast cancer survival rates.

Garland and colleagues performed a statistical analysis of five studies of 25-hydroxyvitamin D obtained at the time of patient diagnosis and their follow-up for an average of nine years. Combined, the studies included 4,443 breast cancer patients.

Women in the high serum group had an average level of 30 nanograms per milliliter (ng/ml) of 25-hydroxyvitamin D in their blood. The low group averaged 17 ng/ml. The average level in patients with breast cancer in the United States is 17 ng/ml.

A 2011 meta-analysis by Garland and colleagues estimated that a serum level of 50 ng/ml is associated with 50 percent lower risk of breast cancer. While there are some variations in absorption, those who consume 4,000 International Units (IU) per day of vitamin D from food or a supplement normally would reach a serum level of 50 ng/ml. 

From Science Daily:

Vitamin D deficiency may compromise immune function

Older individuals who are vitamin D deficient also tend to have compromised immune function, according to new research accepted for publication in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism (JCEM).

"Our data suggest vitamin D may be involved in maintaining the health of the immune system as well as the skeletal system," said one of the study's authors, Mary Ward, PhD, of the University of Ulster in Coleraine, U.K.

The observational study of 957 Irish adults who were at least 60 years old examined vitamin D levels as well as biomarkers of inflammation. Participants who were vitamin D deficient were more likely to have high levels of these biomarkers, which are linked to cardiovascular disease and inflammatory conditions such as multiple sclerosis and rheumatoid arthritis.

From Science Daily:

Meat and cheese may be as bad for you as smoking

That chicken wing you're eating could be as deadly as a cigarette. In a new study that tracked a large sample of adults for nearly two decades, researchers have found that eating a diet rich in animal proteins during middle age makes you four times more likely to die of cancer than someone with a low-protein diet -- a mortality risk factor comparable to smoking.

"There's a misconception that because we all eat, understanding nutrition is simple. But the question is not whether a certain diet allows you to do well for three days, but can it help you survive to be 100?" said corresponding author Valter Longo, the Edna M. Jones Professor of Biogerontology at the USC Davis School of Gerontology and director of the USC Longevity Institute.

Not only is excessive protein consumption linked to a dramatic rise in cancer mortality, but middle-aged people who eat lots of protein from animal sources -- including meat, milk and cheese -- are also more susceptible to early death in general, reveals the study published March 4 in Cell Metabolism. Protein-lovers were 74 percent more likely to die of any cause within the study period than their low-protein counterparts. They were also several times more likely to die of diabetes.

But how much protein we should eat has long been a controversial topic -- muddled by the popularity of protein-heavy diets such as Paleo and Atkins. Before this study, researchers had never shown a definitive correlation between high protein consumption and mortality risk.

Rather than look at adulthood as one monolithic phase of life, as other researchers have done, the latest study considers how biology changes as we age, and how decisions in middle life may play out across the human lifespan.

In other words, what's good for you at one age may be damaging at another. Protein controls the growth hormone IGF-I, which helps our bodies grow but has been linked to cancer susceptibility. Levels of IGF-I drop off dramatically after age 65, leading to potential frailty and muscle loss. The study shows that while high protein intake during middle age is very harmful, it is protective for older adults: those over 65 who ate a moderate- or high-protein diet were less susceptible to disease.

"The research shows that a low-protein diet in middle age is useful for preventing cancer and overall mortality, through a process that involves regulating IGF-I and possibly insulin levels," said co-author Eileen Crimmins, the AARP Chair in Gerontology at USC. "However, we also propose that at older ages, it may be important to avoid a low-protein diet to allow the maintenance of healthy weight and protection from frailty."

Crucially, the researchers found that plant-based proteins, such as those from beans, did not seem to have the same mortality effects as animal proteins. Rates of cancer and death also did not seem to be affected by controlling for carbohydrate or fat consumption, suggesting that animal protein is the main culprit.

People who ate a moderate amount of protein were still three times more likely to die of cancer than those who ate a low-protein diet in middle age, the study shows. Overall, even the small change of decreasing protein intake from moderate levels to low levels reduced likelihood of early death by 21 percent.


From Medscape:

Allergic Rhinitis Patients Live Longer

Their runny noses might drive them crazy, but people with allergic rhinitis are likely to outlive the rest of us, a new study suggests.

"We found that allergic rhinitis patients had a decreased risk of heart attack, a decreased risk of stroke and, most strikingly, a decreased risk of all-cause mortality," said lead investigator Angelina Crans Yoon, MD, from the Department of Allergy and Clinical Immunology at the Kaiser Permanente Los Angeles Medical Center.

"They were basically half as likely to die during the study period," she told Medscape Medical News. 

Researchers studying data from the National Health and Nutrition Examination Survey (NHANES) found that people who tested positive for allergies were less likely to suffer cardiovascular events.

To explore the issue further, Dr. Crans Yoon and her team looked at a database of Southern California patients. The cohort consisted of 109,229 patients with allergic rhinitis and 109,229 people without allergic rhinitis who were matched for age, sex, and ethnicity. It also included 92,775 patients with asthma who were matched with a similar group without asthma.

Risk for acute myocardial infarction was 25% lower in patients with allergic rhinitis than in those without, risk for a cerebrovascular event was 19% lower, and risk for all-cause mortality was 49% lower. Risk for all cardiovascular events was similar in the allergic rhinitis and control groups.

In contrast, risk for all cardiovascular events was 36% higher in patients with asthma than in those without, whereas risk for cerebrovascular disease and all-cause mortality were similar.

This could be the result of a difference in phenotypes in asthma patients, said Dr. Crans Yoon. People whose asthma is caused by allergies could be at less risk for cardiovascular events than people whose asthma has other causes.

Why should allergic rhinitis decrease someone's risk for death? 

One explanation could be that the immune systems of patients with allergic rhinitis are hyperalert, aggressively fighting off disease, as well as causing symptoms, when it is not necessary. More work is needed to evaluate that idea.

Keep your eye on future nanosilver research. From Science Daily:

More dangerous chemicals in everyday life: Now experts warn against nanosilver

Endocrine disrupters are not the only worrying chemicals that ordinary consumers are exposed to in everyday life. Nanoparticles of silver, found in products such as dietary supplements, cosmetics and food packaging, now worry scientists as well. A new study from the University of Southern Denmark shows that nano-silver can penetrate our cells and cause damage.

Silver has an antibacterial effect, so the food and cosmetic industries often coat their products with silver nanoparticles. Nano-silver can be found in products such as drinking bottles, cosmetics, band-aids, toothbrushes, running socks, refrigerators, washing machines and food packaging.

"Silver as a metal does not pose any danger, but when you break it down to nano-sizes, the particles become small enough to penetrate a cell wall. If nano-silver enters a human cell, it can cause changes in the cell," explain Associate Professor Frank Kjeldsen and Thiago Verano-Braga, PhD, of the Department of Biochemistry and Molecular Biology at the University of Southern Denmark.

The researchers examined human intestinal cells, as they consider these to be most likely to come into contact with nano-silver, ingested with food.

"We can confirm that nano-silver leads to the formation of harmful, so-called free radicals in cells. We can also see that there are changes in the form and amount of proteins. This worries us," say Frank Kjeldsen and Thiago Verano-Braga.

A large number of serious diseases are characterized by the fact that there is an overproduction of free radicals in cells. This applies to cancer and neurological diseases such as Alzheimer's and Parkinson's.

Kjeldsen and Verano-Braga emphasize that their research was conducted on human cells in a laboratory, not on living people. They also point out that they do not know how large a dose of nano-silver a person must be exposed to before cellular changes emerge.

"We don't know how much is needed, so we cannot conclude that nano-silver can make you sick. But we can say that we must be very cautious and worried when we see an overproduction of free radicals in human cells," they say.

Nano-silver is also sold as a dietary supplement, with promises of antibacterial, anti-flu and cancer-inhibiting effects. It is also claimed to help against low blood counts and bad skin. In the EU, marketing dietary supplements and foods with claims of medical effects is not allowed, but nano-silver is easy to find and buy online.

In the wake of the University of Southern Denmark research, the Danish Veterinary and Food Administration now warns against taking dietary supplements with nano-silver.

Please note that 70 grams equals 2 1/2 ounces of chocolate. From Science Daily:

Why dark chocolate is good for your heart

It might seem too good to be true, but dark chocolate is good for you and scientists now know why. Dark chocolate helps restore flexibility to arteries while also preventing white blood cells from sticking to the walls of blood vessels. Both arterial stiffness and white blood cell adhesion are known factors that play a significant role in atherosclerosis. What's more, the scientists also found that increasing the flavanol content of dark chocolate did not change this effect. This discovery was published in the March 2014 issue of The FASEB Journal.

"We provide a more complete picture of the impact of chocolate consumption in vascular health and show that increasing flavanol content has no added beneficial effect on vascular health," said Diederik Esser, Ph.D., a researcher involved in the work from the Top Institute Food and Nutrition and Wageningen University, Division of Human Nutrition in Wageningen, The Netherlands. "However, this increased flavanol content clearly affected taste and thereby the motivation to eat these chocolates. So the dark side of chocolate is a healthy one."

To make this discovery, Esser and colleagues analyzed 44 middle-aged overweight men over two periods of four weeks as they consumed 70 grams of chocolate per day. Study participants received either specially produced dark chocolate with high flavanol content or chocolate that was regularly produced. Both chocolates had a similar cocoa mass content. Before and after both intervention periods, researchers performed a variety of measurements that are important indicators of vascular health. During the study, participants were advised to refrain from certain energy dense food products to prevent weight gain. Scientists also evaluated the sensory properties of the high flavanol chocolate and the regular chocolate and collected the motivation scores of the participants to eat these chocolates during the intervention.

"The effect that dark chocolate has on our bodies is encouraging not only because it allows us to indulge with less guilt, but also because it could lead the way to therapies that do the same thing as dark chocolate but with better and more consistent results," said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. 

From NPR:

More Hints That Dad's Age At Conception Helps Shape A Child's Brain

Traditionally, research has focused on women's "biological clock." But in recent years, scientists have been looking more and more at how the father's age at conception might affect the baby, too. A study published Wednesday hints that age really might matter — in terms of the child's mental health.

Researchers from Indiana University and the Karolinska Institute found that compared with children fathered by men who were 20-24 years old, kids born to dads who were 45 or older were three times as likely to have autism and 13 times as likely to have ADHD. Kids born to older dads were also more likely to go on to develop substance abuse problems and to get lower grades in school. The findings appear in JAMA Psychiatry.

To figure out how paternal age was related to children's psychiatric health, the researchers looked at millions of parents in Sweden who had children between 1973 and 2001. The researchers took into account the mother's age, as well as other demographic factors that might play a role in the child's cognitive development and mental health.

"There's a growing body of literature that suggests that advancing paternal age is associated with a host of problems," lead author Brian D'Onofrio tells Shots. Another study, published in JAMA Psychiatry last month, found that the children of older fathers seemed to be at greater risk for developing schizophrenia and autism.

D'Onofrio and his colleagues paid special attention to siblings and cousins, and found that even among kids in the same extended family, a dad's age when his child was born made a difference.

The results are in line with a growing body of research linking older fatherhood with various developmental problems in children.

However, the study looks only at how paternal age and children's mental health are associated — it's a correlation, Reichenberg cautions, not a proven causal link. Scientists haven't yet determined the mechanisms of the effect. But it doesn't seem to be simply a matter of overdiagnosis among the children of older parents, the scientists say. Other research has found that as men get older, their sperm cells are more likely to contain random mutations that might, theoretically, contribute to disorders like autism in their kids.

Ultimately, men and women of all ages, he says, should remember that age is only one of many factors influencing the developing baby's health.

"The most important thing is [that] future mothers and fathers should still go ahead and have children, even if the father is older than 45 or 50," Reichenberg says. "Most of these children will be absolutely fine."

Please note: 500 g (half a kilo) is about 1.1 pounds, or roughly 3 1/3 cups of whole strawberries. From Science Daily:

Strawberries lower cholesterol, study suggests

A team of volunteers ate half a kilo of strawberries a day for a month to see whether it altered their blood parameters in any way. At the end of this unusual treatment, their levels of bad cholesterol and triglycerides had fallen significantly, according to the analyses conducted by Italian and Spanish scientists.

Several studies had already demonstrated the antioxidant capacity of strawberries, but now researchers from the Università Politecnica delle Marche (UNIVPM, Italy), together with colleagues from the Universities of Salamanca, Granada and Seville (Spain), conducted an analysis that revealed that these fruits also help to reduce cholesterol.

The team set up an experiment in which they added 500 g of strawberries to the daily diets of 23 healthy volunteers over a month. They took blood samples before and after this period to compare data.

The results, which are published in the Journal of Nutritional Biochemistry, show that total cholesterol, low-density lipoprotein (LDL, or bad cholesterol) and triglycerides fell by 8.78%, 13.72% and 20.8%, respectively. High-density lipoprotein (HDL, or good cholesterol) remained unchanged.

Eating strawberries also improved other parameters such as the general plasma lipid profile, antioxidant biomarkers (such as vitamin C or oxygen radical absorbance capacity), antihemolytic defences and platelet function. All parameters returned to their initial values 15 days after abandoning 'treatment' with strawberries.

The researchers admit that there is still no direct evidence about which compounds in the fruit are behind its beneficial effects, "but all the signs and epidemiological studies point towards anthocyanins, the vegetable pigments that afford them their red colour."

The research team confirmed in other studies that eating strawberries also protects against ultraviolet radiation, reduces the damage that alcohol can have on the gastric mucosa, strengthens erythrocytes, or red blood cells, and improves the antioxidant capacity of the blood.

Acetaminophen was the one nonprescription medication that pregnant women have, for decades, thought safe to take. Looks like that's no longer the case - a study found that taking acetaminophen during pregnancy was associated with hyperkinetic disorder and ADHD at age 7, and the longer it was taken during pregnancy, the stronger the association. From Science Daily:

Use of acetaminophen during pregnancy linked to ADHD in children, researchers say

Acetaminophen, found in over-the-counter products such as Excedrin and Tylenol, provides many people with relief from headaches and sore muscles. When used appropriately, it is considered mostly harmless. Over recent decades, the drug, which has been marketed since the 1950s, has become the medication most commonly used by pregnant women for fevers and pain.

In a report in the current online edition of JAMA Pediatrics, researchers from the UCLA Fielding School of Public Health show that taking acetaminophen during pregnancy is associated with a higher risk in children of attention-deficit/hyperactivity disorder and hyperkinetic disorder. The data raise the question of whether the drug should be considered safe for use by pregnant women.

ADHD, one of the most common neurobehavioral disorders worldwide, is characterized by inattention, hyperactivity, increased impulsivity, and motivational and emotional dysregulation. Hyperkinetic disorder is a particularly severe form of ADHD.

The UCLA researchers used the Danish National Birth Cohort, a nationwide study of pregnancies and children, to examine pregnancy complications and diseases in offspring as a function of factors operating in early life. The cohort focuses especially on the side effects of medications and infections. The researchers studied 64,322 children and mothers who were enrolled in the Danish cohort from 1996 to 2002. 

More than half of all the mothers reported using acetaminophen while pregnant. The researchers found that children whose mothers used acetaminophen during pregnancy were at a 13 percent to 37 percent higher risk of later receiving a hospital diagnosis of hyperkinetic disorder, being treated with ADHD medications or having ADHD-like behaviors at age 7. The longer acetaminophen was taken -- that is, into the second and third trimesters -- the stronger the associations. The risks for hyperkinetic disorder/ADHD in children were elevated 50 percent or more when the mothers had used the common painkiller for more than 20 weeks in pregnancy.

"It's known from animal data that acetaminophen is a hormone disruptor, and abnormal hormonal exposures in pregnancy may influence fetal brain development," said study author Beate Ritz. Acetaminophen can cross the placental barrier, Ritz noted, and it is plausible that acetaminophen may interrupt fetal brain development by interfering with maternal hormones or through neurotoxicity, such as the induction of oxidative stress, which can cause the death of neurons.