
The latest results on this hotly debated subject. The researchers suggest that people instead use "hands free phones with the loud speaker feature". From Medscape:

Long-Term Cell Phone Use Linked to Brain Tumor Risk

Long-term use of both mobile and cordless phones is associated with an increased risk for glioma, the most common type of brain tumor, the latest research on the subject concludes.

The new study shows that the risk for glioma was tripled among those using a wireless phone for more than 25 years and that the risk was also greater for those who had started using mobile or cordless phones before age 20 years.

The recent worldwide increase in use of wireless communications has resulted in greater exposure to radio frequency electromagnetic fields (RF-EMF). The brain is the main target of RF-EMF when these phones are used, with the highest exposure being on the same side of the brain where the phone is placed.

The analysis included 1498 cases of malignant brain tumors; the mean age was 52 years. Most patients (92%) had a diagnosis of glioma, and just over half of the gliomas (50.3%) were the most malignant variety — astrocytoma grade IV (glioblastoma multiforme). Also included were 3530 controls, with a mean age of 54 years.

The analysis showed an increased risk for glioma associated with use for more than 1 year of both mobile and cordless phones after adjustment for age at diagnosis, sex, socioeconomic index, and year of diagnosis. The highest risk was seen in those with the longest latency: mobile phone use for more than 25 years.

The risk was increased the more that wireless phones were used. The odds ratios steadily rose with increasing hours of use... Further, the risk was highest among participants who first used a mobile phone (odds ratio, 1.8) or cordless phone (odds ratio, 2.3) before age 20 years, although the number of cases and controls was relatively small.

As Dr Hardell explained, children and adolescents are more exposed to RF-EMF than adults because of their thinner skull bone and smaller head and the higher conductivity in their brain tissue. The brain is still developing up to about the age of 20 and until that time it is relatively vulnerable, he said.

There was a higher risk for third-generation (3G) mobile phone use compared with other types, but this was based on short latency and rather low numbers of exposed participants, said the authors. 3G Universal Mobile Telecommunications System (UMTS) phones emit wideband microwave signals, which "hypothetically" may result in higher biological effects compared with other signals, they write.

Numerous studies have looked at the link between use of wireless phones and brain tumors. Studies by Dr Hardell and his colleagues dating back to the late 1990s have found a connection with mobile and cordless phones. But the INTERPHONE study (Int J Epidemiol 2011;39:675-694; Cancer Epidemiol 2011;32:453-464) failed to find strong evidence that mobile phones increase the risk for brain tumors.

In addition, a large prospective study (Int J Epidemiol 2013;42:792-802) found that mobile phone use was not associated with increased incidence of glioma or of meningioma or non–central nervous system cancers in middle-aged British women.

Pathophysiology. Published online October 28, 2014. Abstract

This recent scientific (and yes, technical) article discusses the tantalizing promise of treating cancer, especially melanoma, with infections and certain vaccines. There is much discussion of how two vaccines already in use may prevent some cancers such as melanoma and leukemia: vaccination of newborns with Bacille Calmette-Guerin (BCG) and vaccination of adults with the yellow fever 17D vaccine. The article is a further development of what was discussed in the last post (injecting a person with a bacterial extract, called Coley's toxins or Coley toxins, to cause an infection and so treat cancer). From BioMed Central:

The biography of the immune system and the control of cancer: from St Peregrine to contemporary vaccination strategies

In 1875 Campbell de Morgan, a surgeon at the Middlesex Hospital in London, reported that regressions and remissions of cancers sometimes occurred after post-operative infections, particularly the streptococcal infection erysipelas...

Campbell de Morgan’s observation that remissions sometimes occurred after post-operative streptococcal infections inspired some workers to undertake the risky procedure of deliberately inducing erysipelas in cancer patients. Subsequently, an American surgeon, William Coley, developed bacteria-free extracts of streptococci and other bacteria (“Coley toxins”) and reported their successful use in the therapy of cancers, especially sarcomas, between 1891 and 1936. Unfortunately Coley, a mild-mannered and unassuming gentleman, did not adhere to rigorous scientific protocols in his studies and he was marginalized by forceful personalities advocating radiotherapy. Notwithstanding, an analysis of his results with cancers deemed inoperable, undertaken in 1994, revealed a remission rate of 64% and a five-year survival rate of 44%, results equal to or better than those with modern therapies [14].

It is also now appreciated that chronic inflammation is an essential element of cancers and it has indeed been termed ‘the other half of the tumour’ [37]. The normal healing process relies on inflammation, collagen production, angiogenesis and cell proliferation and, in a description of the similarities between tumour stroma formation and wound healing, tumours have been referred to as “wounds that do not heal” [38].

The relationship between infection, and associated inflammation, and cancer is a complex and paradoxical one and there are several well described examples of cancer being the direct consequence of infection [41]. Around 2 million of the 12.7 million new cancer cases worldwide in 2008 (16.1%) were assumed to be related to infection, principally Helicobacter pylori, hepatitis viruses, and the human papilloma virus, with a higher proportion in developing countries (22.9%) than in developed ones (7.4%) [42]. The large majority of cases of cancer, especially those in the developed nations, are therefore not caused by infection – on the contrary, there is growing evidence that a history of certain infections and environmental exposure to certain populations of micro-organisms, as well as some types of vaccination, may induce patterns of immune reactivity that reduce the risk of at least some cancers.

A study of an adult population in Italy demonstrated an association between a history of common childhood infectious diseases (measles, chickenpox, rubella, mumps and pertussis) and the risk of developing chronic lymphatic leukaemia (CLL), with a strong inverse relationship between the risk of CLL and the number of infections (p = 0.002) [47]. 

In the 1990s Kölmel and colleagues established a working group – Febrile Infections and Melanoma (FEBIM) – within the European Organization for Research and Treatment of Cancer (EORTC). Based on a pilot study [79] this group undertook a series of studies to establish the relationship between the risk for developing melanoma and a history of, initially, infectious diseases [80], and, subsequently, also of vaccinations [81,82].

In the first report of the FEBIM group a significant level of protection against melanoma in those with a history of certain severe infections (sepsis, Staph. aureus infection, pneumonia, pulmonary tuberculosis) with fever of over 38.5°C was demonstrated [80]. It should, however, be noted that these apparently melanoma-protective infectious diseases have become rare in the industrialized nations. 

It is claimed that, as a result of recent observational studies, measures for prevention of some malignancies such as melanoma and certain forms of leukaemia are already at hand: vaccination with Bacille Calmette-Guérin (BCG) of new-borns and vaccination with the yellow fever 17D (YFV) vaccine of adults. While the evidence of their benefit for prevention of malignancies requires substantiation, the observations that vaccinations with BCG and/or vaccinia early in life improved the outcome of patients after surgical therapy of melanoma are of practical relevance, as the survival advantage conferred by prior vaccination is greater than that provided by any contemporary adjuvant therapy.

Yes! An approach to ADHD that makes sense. Nice piece from Richard A. Friedman, professor of clinical psychiatry and director of the psychopharmacology clinic at the Weill Cornell Medical College. From NY Times:

A Natural Fix for A.D.H.D.

Attention deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease.

And for a good reason. Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating.

To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture.

From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting. The more novel and unpredictable the experience, the greater the activity in your reward center. But what is stimulating to one person may be dull — or even unbearably exciting — to another. There is great variability in the sensitivity of this reward circuit.

These findings suggest that people with A.D.H.D. are walking around with reward circuits that are less sensitive at baseline than those of the rest of us. Having a sluggish reward circuit makes normally interesting activities seem dull and would explain, in part, why people with A.D.H.D. find repetitive and routine tasks unrewarding and even painfully boring.

Another patient of mine, a 28-year-old man, was having a lot of trouble at his desk job in an advertising firm. Having to sit at a desk for long hours and focus his attention on one task was nearly impossible. He would multitask, listening to music and texting, while “working” to prevent activities from becoming routine. Eventually he quit his job and threw himself into a start-up company, which has him on the road in constantly changing environments. He is much happier and — little surprise — has lost his symptoms of A.D.H.D.

My patient “treated” his A.D.H.D. simply by changing the conditions of his work environment from one that was highly routine to one that was varied and unpredictable. All of a sudden, his greatest liabilities — his impatience, short attention span and restlessness — became assets. And this, I think, gets to the heart of what is happening in A.D.H.D.

Consider that humans evolved over millions of years as nomadic hunter-gatherers. It was not until we invented agriculture, about 10,000 years ago, that we settled down and started living more sedentary — and boring — lives. As hunters, we had to adapt to an ever-changing environment where the dangers were as unpredictable as our next meal. In such a context, having a rapidly shifting but intense attention span and a taste for novelty would have proved highly advantageous in locating and securing rewards — like a mate and a nice chunk of mastodon. In short, having the profile of what we now call A.D.H.D. would have made you a Paleolithic success story.

So if you are nomadic, having a gene that promotes A.D.H.D.-like behavior is clearly advantageous (you are better nourished), but the same trait is a disadvantage if you live in a settled context.

You may wonder what accounts for the recent explosive increase in the rates of A.D.H.D. diagnosis and its treatment through medication. The lifetime prevalence in children has increased to 11 percent in 2011 from 7.8 percent in 2003 — a whopping 41 percent increase — according to the Centers for Disease Control and Prevention. And 6.1 percent of young people were taking some A.D.H.D. medication in 2011, a 28 percent increase since 2007. Most alarmingly, more than 10,000 toddlers at ages 2 and 3 were found to be taking these drugs, far outside any established pediatric guidelines.
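(A quick note on the arithmetic: the 41 percent figure is the relative increase, (11.0 - 7.8) / 7.8 ≈ 0.41, while the absolute rise is 3.2 percentage points.)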

Some of the rising prevalence of A.D.H.D. is doubtless driven by the pharmaceutical industry, whose profitable drugs are the mainstay of treatment. Others blame burdensome levels of homework, but the data show otherwise. Studies consistently show that the number of hours of homework for high school students has remained steady for the past 30 years.

I think another social factor that, in part, may be driving the “epidemic” of A.D.H.D. has gone unnoticed: the increasingly stark contrast between the regimented and demanding school environment and the highly stimulating digital world, where young people spend their time outside school. Digital life, with its vivid gaming and exciting social media, is a world of immediate gratification where practically any desire or fantasy can be realized in the blink of an eye. By comparison, school would seem even duller to a novelty-seeking kid living in the early 21st century than in previous decades, and the comparatively boring school environment might accentuate students’ inattentive behavior, making their teachers more likely to see it and driving up the number of diagnoses.

Perhaps one explanation is that adults have far more freedom to choose the environment in which they live and the kind of work they do so that it better matches their cognitive style and reward preferences. If you were a restless kid who couldn’t sit still in school, you might choose to be an entrepreneur or carpenter, but you would be unlikely to become an accountant. 

Lasting benefits from lifestyle changes (Mediterranean diet and exercise). From Science Daily:

Mediterranean diets have lasting health benefits

The health benefits of switching to a Mediterranean style diet and upping the amount of time spent exercising for a period of just eight weeks can still be seen a year after stopping the regime, a new study has shown.

The research by Sheffield Hallam University and the University of Lincoln in the UK revealed that the diet and exercise combination leads to improved blood flow in cells in the inner lining of the blood vessels -- called the endothelial cells -- a full 12 months after completing participation in the intervention programme.

Endothelial cells line the interior of the entire vascular system of the human body -- from the large arteries to the smallest capillaries -- and improvements in their function could reduce the risk of people developing cardiovascular disease, the study said.

Researchers believe the long-term health benefits observed after such a short intervention could be due to molecular changes associated with the Mediterranean diet. Traditional Mediterranean cuisine is based on olive oil, fruit, vegetables and salad, fish, legumes, whole grain foods, wine and limited consumption of red meat.

The study focused on healthy people over the age of 50. Participants were originally assessed over an eight-week period. One group was encouraged to eat more vegetables, fruit, olive oil, tree nuts and fresh oily fish, as well as take up a moderate exercise regime, while the other just took up exercise alone.

The results showed greater health improvements in the Mediterranean diet group than in the exercise-only group, and these improvements were still evident one year later, even though the lifestyle changes made during the study were no longer being carefully followed.


I think many will say: Oh no! Totally vegan is best for weight loss?? From Science Daily:

Vegan diet best for weight loss even with carbohydrate consumption, study finds

People shed more weight on an entirely plant-based diet, even if carbohydrates are also included, a study has concluded. Other benefits of eating a vegan diet include decreased levels of saturated and unsaturated fat, lower BMIs, and improved macronutrient levels.

The study, conducted by the University of South Carolina's Arnold School of Public Health and published in The International Journal of Applied and Basic Nutritional Sciences, compared the amount of weight lost by those on vegan diets to those on a mostly plant-based diet, and those eating an omnivorous diet with a mix of animal products and plant-based foods. At the end of six months, individuals on the vegan diet lost more weight than the other two groups by an average of 4.3%, or 16.5 pounds.

The study followed participants who were randomly assigned to one of five diets on the dietary spectrum: vegan, which excludes all animal products; semi-vegetarian, with occasional meat intake; pesco-vegetarian, which excludes all meat except seafood; vegetarian, which excludes all meat and seafood but includes other animal products; and omnivorous, which excludes no foods.

Participants followed their assigned dietary restrictions for six months, with all groups except the omnivorous participating in weekly group meetings. Those who stuck to the vegan diet showed the greatest weight loss at the two- and six-month marks. The lead author of the study, Gabrielle Turner-McGrievy, notes that the diet consumed by vegan participants was high in carbohydrates that rate low on the glycemic index.

Think lifestyle changes, not medications. From Medical Daily:

Mild Hypertension Should Be Treated With Advice On Lifestyle Changes, Not Medication

In 2013, Dr. Iona Heath, a retired general practitioner, published an article in the Journal of the American Medical Association in which she discussed the harms of overtreatment and overdiagnosis of mild hypertension. Now, in a new study, researchers revisit this idea, saying that unnecessary treatment of mild hypertension in low-risk patients is harming them and putting a burden on health care resources. They also argue that there is a need to reexamine the criteria for diagnosing hypertension and treating blood pressure.

About 40 percent of the world’s population, including 67 million American adults, have hypertension. Over half are classified as having mild hypertension. 

More than half of people with mild hypertension are treated with drugs, yet there is no evidence that blood pressure-lowering drugs prevent heart attacks or deaths in this low-risk group. Instead of prescribing drugs to control mild hypertension, the authors urge clinicians to recommend healthier lifestyles to patients, including exercising, quitting smoking, and cutting back on alcohol. They also urge clinics to improve the accuracy of blood pressure-measuring instruments and to inform patients about measuring blood pressure at home.

From Medical Xpress:

Experts raise concern over unnecessary treatment of mild hypertension in low risk people

Lowering the drug threshold for high blood pressure has exposed millions of low-risk people around the world to drug treatment of uncertain benefit at huge cost to health systems, warn US experts in BMJ today. Dr Stephen Martin and colleagues argue that this strategy is failing patients and wasting healthcare resources.

Over half of people with mild hypertension are treated with medication. Yet treating low risk mildly hypertensive patients with drugs has not been proven to reduce cardiovascular disease or death. The authors argue that overemphasis on drug treatment "risks adverse effects, such as increased risk of falls, and misses opportunities to modify individual lifestyle choices and tackle lifestyle factors at a public health level."

And for those over 65, the acceptable blood pressure levels can be even higher. From Science News:

'Mild' control of systolic blood pressure in older adults is adequate: 150 is good enough

A broad review of the use of medications to reduce blood pressure has confirmed that "mild" control of systolic pressure is adequate for adults age 65 or older -- in the elderly, there's no clear benefit to more aggressive use of medications to achieve a lower pressure. Historically, most medical practitioners tried to achieve control of systolic pressure -- the higher of the two blood pressure readings -- to 140 or less. Recently changed guidelines now suggest that for adults over 60, keeping the systolic pressure at 150 or less is adequate, and this extensive analysis confirms that.

Take note: research has linked a lack of microbial diversity in human guts to various diseases. A solution: Eat more plants! From Science Daily:

Compared with apes, people's gut bacteria lack diversity, study finds

The microbes living in people's guts are much less diverse than those in humans' closest relatives, the African apes, a long evolutionary trend that appears to be speeding up in more modern societies, with possible implications for human health, according to a new study.

Based on an analysis of how humans and three lineages of ape diverged from common ancestors, researchers determined that within the lineage that gave rise to modern humans, microbial diversity changed slowly and steadily for millions of years, but that rate of change has accelerated lately in humans from some parts of the world.

People in nonindustrialized societies have gut microbiomes that are 60 percent different from those of chimpanzees. Meanwhile, those living in the U.S. have gut microbiomes that are 70 percent different from those of chimps.

 "On the other hand, in apparently only hundreds of years -- and possibly a lot fewer -- people in the United States lost a great deal of diversity in the bacteria living in their gut."

That rapid change might translate into negative health effects for Americans. Previous research has shown that compared with several populations, people living in the U.S. have the lowest diversity of gut microbes. Still other research has linked a lack of microbial diversity in human guts to various diseases such as asthma, colon cancer and autoimmune diseases.

One possible explanation for humans evolving to have less diversity in their gut microbiomes is that they shifted to a diet with more meat and fewer plants. Plants require complex communities of microbes to break them down, which is not as true for meat.

As for why Americans have experienced much more rapid changes in microbial diversity compared with people in less industrialized societies, some experts have suggested more time spent indoors, increased use of antibacterial soaps and cleaners, widespread use of antibiotics and high numbers of births by Cesarean section all may play a role. Antibiotics and antimicrobial cleaners can kill good bacteria along with the bad, and C-section deliveries prevent babies from receiving certain bacteria from the mother typically conferred during vaginal births.

An amazing and unforgettable story of a man researching the gut microbes that are increasingly lost in developed Westernized populations. And do go read the original story (see link). From Popular Science:

Scientist Gives Himself Fecal Transplant To Try A Hunter-Gatherer's Microbiome

Why a field researcher from America has exposed his colon to the gut microbiome of a tribesman from Tanzania.

It's not often we encounter a story that begins with a line like this: “AS THE SUN set over Lake Eyasi in Tanzania, nearly thirty minutes had passed since I had inserted a turkey baster into my bum and injected the feces of a Hadza man – a member of one of the last remaining hunter-gatherer tribes in the world – into the nether regions of my distal colon.”

The guy behind this essay, Jeff Leach, is part of a multi-national scientific research team that by his account has been living with the Hadza, hunter-gatherers in Tanzania, for over a year. They have collected hundreds of samples from humans, animals, and the environment in order to observe how the microbial communities in and around the Hadza change with the dramatic seasonal weather shifts in East Africa: six months of near-steady rain followed by six dry months.

The question driving the research is “what a normal or healthy microbiome might have looked like before the niceties and medications of late whacked the crap out of our gut bugs in the so-called modern world,” Leach writes.

The Hadza are contemporary people, Leach writes, not an undiscovered stone-age civilization. But they're excellent subjects for this research because they still live on plant and animal foods that humans have hunted and gathered for millennia, and their use of western medications is extremely limited.

The health impacts of what lives (or doesn't) in our guts are getting increased attention in Western dietary and medical circles -- and eating foods containing "probiotics" just scratches the surface. Recent research suggests that use of antibiotics may be fundamentally altering our gut biomes for the worse, increasing rates of allergies, asthma and weight gain.

As for fecal transplants, they're no longer career killers in polite medical conversation. Swapping poop from healthy to sick persons is now an up-and-up treatment for curing chronic gastrointestinal disease. The launch of the OpenBiome fecal transplant bank in the U.S. earlier this year seems to signal that the technique is going mainstream.

As for Jeff Leach, he describes his primary scientific motivation for self-administering a fecal transplant as testing the hypothesis "of microbial extinction, something I believe we all suffer from in the western world and may be at the root of what’s making us sick." The biggest change Leach and his girlfriend have noticed since the transplant is that he's passing a lot less gas. 

Read the rest of his very readable, informative and down-to-earth essay: (Re)Becoming Human: what happened the day I replaced 99% of the genes in my body with that [sic] of a hunter-gatherer.

It is estimated that between 14,000 and 30,000 Americans die each year from Clostridium difficile infections. So finding a bacterium that could protect people from C. difficile is a big deal. However, it is only one bacterium, and sick people are typically depleted of a whole microbial community, not just one species. From Science News:

Harmless bacterium edges out intestinal germ

Gut infections from the bacterium Clostridium difficile can be fought with a closely related but harmless microbe known as C. scindens. The friendly bacterium combats infection in mice by converting molecules produced in the liver into forms that inhibit C. difficile growth, researchers report October 22 in Nature.

C. scindens also appears to protect people from infection, the researchers found in a preliminary study in humans. The new findings could begin a path to the next generation of therapies using gut bacteria, says Alexander Khoruts, a gastroenterologist at the University of Minnesota in Minneapolis.

People who become infected with C. difficile typically have taken antibiotics, which wipe out the beneficial microbes in the gut, giving C. difficile a chance to take root. The infection can lead to cramps, diarrhea and even death. An estimated 500,000 to 1 million people get C. difficile infections each year in the United States. People with C. difficile receive more antibiotics to treat the infection or a fecal transplant to restore healthy microbes to the gut.

Several research groups have been trying to identify gut bacteria that are resilient in the face of C. difficile so that physicians can give patients those bacteria as a treatment, says Eric Pamer, an immunologist at Memorial Sloan Kettering Cancer Center. Single strains of bacteria such as C. scindens would offer significant advantages over fecal transplants: With a transplant, doctors screen the donated feces for pathogens that might sicken the recipient. But, Pamer says, “there are many things, viruses that have yet to be identified, that could be in a crude fecal product that might cause trouble.”  

Pamer and his team gave mice antibiotics to deplete beneficial microbes but not wipe them out completely. The researchers then fed the mice C. difficile spores and identified microbes that appeared in mice with lower amounts of C. difficile in their guts. C. scindens was the clear victor. It is harmless and present in most people, but in very low numbers.

The researchers also examined the microbial populations of 24 patients undergoing stem cell transplants. Those patients had lowered microbial diversity after receiving combinations of antibiotics, radiation and chemotherapy. The patients who didn’t develop C. difficile after the transplant were more likely to have C. scindens in their guts.

The researchers also investigated how C. scindens combats C. difficile. C. difficile begins growing after it is exposed to certain molecules secreted in bile after a meal. However, another form of the molecule inhibits C. difficile growth. C. scindens transforms the molecule from one form to the other, boosting resistance to C. difficile.

Again, the same message: get moving for health, including cognitive function. From Science Daily:

To reap the brain benefits of physical activity, just get moving

Everyone knows that exercise makes you feel more mentally alert at any age. But do you need to follow a specific training program to improve your cognitive function? Science has shown that the important thing is to just get moving. It's that simple.

The study compared the effects of different training methods on the cognitive functions of people aged 62 to 84 years. Two groups were assigned a high-intensity aerobic and strength-training program, whereas the third group performed tasks that targeted gross motor activities (coordination, balance, ball games, locomotive tasks, and flexibility). While the aerobics and strength-training were the only exercises that led to physical fitness improvements after 10 weeks (in terms of body composition, VO2 max, and maximum strength), all three groups showed equivalent improvement in cognitive performance.

The subjects in the third group performed activities that can easily be done at home, which is excellent news for sedentary people who can't see themselves suddenly going to a gym to work out. To improve your cognitive health, you can simply start by doing any activity you like. 

"Our study targeted executive functions, or the functions that allow us to continue reacting effectively to a changing environment. We use these functions to plan, organize, develop strategies, pay attention to and remember details, and manage time and space," explained Dr. Louis Bherer, PhD.

"For a long time, it was believed that only aerobic exercise could improve executive functions. More recently, science has shown that strength-training also leads to positive results. Our new findings suggest that structured activities that aim to improve gross motor skills can also improve executive functions, which decline as we age. I would like seniors to remember that they have the power to improve their physical and cognitive health at any age and that they have many avenues to reach this goal," concluded Dr. Nicolas Berryman, PhD.