
Al's papers' citations and possibly links and excerpts or my synopses


The othering of old age: Insights from Postcolonial Studies.

van Dyk S.

J Aging Stud. 2016 Dec;39:109-120. doi: 10.1016/j.jaging.2016.06.005.

PMID: 27912849


When it comes to old age, we are witnessing almost revolutionary changes at the present time. After decades of ignorance and lack of public interest, old age has fundamentally been re-negotiated. A diverse range of authors have diagnosed the growing bifurcation of old age into a rather independent and capable Third Age and a deep old Fourth Age that is characterized by sickness, frailty and dependency. Against this backdrop, many gerontologists claim that the so-called young-old are praised and valued for their (ongoing) "sameness" in terms of midlife-norms and capabilities, whereas the oldest old are increasingly excluded from humanity by radical "othering". Taking up this diagnosis, the article elaborates on this growing polarization within later life: Based on empirical research on the re-negotiation of old age in Germany, this contribution argues that the juxtaposition of "sameness" and "otherness" obscures the true character of the polarization, particularly with regard to the social role of the Third Age. Instead of sameness and otherness, we rather witness different processes of othering, with the young-old being valued as the other and the oldest old being disdained as the other. Despite the existence of profound critical analyses of the abjection associated with the Fourth Age as well as a considerable amount of literature on old age activation and the new role of the young-old, the specific point of this article's concern (the othering of the Third Age) has been completely neglected. The article discusses the reasons for this gap in more detail and will indicate to what extent concepts from Postcolonial Studies may help us to understand the dual process of othering: glorification and abjection.


Active aging; Critical gerontology; Fourth age; Othering; Postcolonial theory; Third age


Self-reported visual impairment, physical activity and all-cause mortality: The HUNT Study.

Brunes A, Flanders WD, Augestad LB.

Scand J Public Health. 2016 Dec 1. pii: 1403494816680795. [Epub ahead of print]

PMID: 27913690



To examine the associations of self-reported visual impairment and physical activity (PA) with all-cause mortality.


This prospective cohort study included 65,236 Norwegians aged ⩾20 years who had participated in the Nord-Trøndelag Health Study (HUNT2, 1995-1997). Of these participants, 11,074 (17.0%) had self-reported visual impairment (SRVI). The participants' data were linked to Norway's Cause of Death Registry and followed throughout 2012. Hazard ratios and 95% confidence intervals (CI) were assessed using Cox regression analyses with age as the time-scale. The Cox models were fitted for restricted age groups (<60, 60-84, ⩾85 years).


After a mean follow-up of 14.5 years, 13,549 deaths were identified. Compared with adults without self-reported visual impairment, the multivariable hazard ratios among adults with SRVI were 2.47 (95% CI 1.94-3.13) in those aged <60 years, 1.22 (95% CI 1.13-1.33) in those aged 60-84 years and 1.05 (95% CI 0.96-1.15) in those aged ⩾85 years. The strength of the associations remained similar or stronger after additionally controlling for PA. When examining the joint associations, the all-cause mortality risk of SRVI was higher for those who reported no PA than for those who reported weekly hours of PA. We found a large, positive departure from additivity in adults aged <60 years, whereas the departure from additivity was small for the other age groups.
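[The "departure from additivity" reported above is commonly quantified as the relative excess risk due to interaction, RERI = HR11 - HR10 - HR01 + 1. A minimal sketch; the hazard ratios here are hypothetical, not the joint estimates from the paper:]

```python
def reri(hr_both, hr_exposure_only, hr_covariate_only):
    """Relative excess risk due to interaction: HR11 - HR10 - HR01 + 1.

    A value > 0 indicates a positive departure from additivity, i.e. the
    joint exposure carries more excess risk than the sum of the excess
    risks of the two exposures taken separately.
    """
    return hr_both - hr_exposure_only - hr_covariate_only + 1.0

# Hypothetical hazard ratios (reference group: no SRVI, weekly PA)
hr_srvi_no_pa = 4.0   # SRVI and no physical activity
hr_srvi_pa = 2.0      # SRVI only
hr_no_pa = 1.5        # no physical activity only

print(reri(hr_srvi_no_pa, hr_srvi_pa, hr_no_pa))  # 1.5 -> positive departure
```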



All-cause mortality; HUNT study; physical activity; prospective cohort study; self-reported; visual impairment


Natural Remedies to Dissolve Kidney Stones

Posted on Dec. 2, 2016, 6 a.m. in Alternative Medicine Functional Foods

Here are some natural treatments that may help prevent or soothe the discomfort of kidney stones and facilitate the healing process.


Anyone who has ever passed kidney stones can relate in excruciating detail just how painful that process can be. People suffering from this event often end up in the emergency room desperately seeking any medical relief available. If the kidney stones do not pass and the pain does not subside, a CT scan may be needed to determine the size and number of stones present. A relatively new procedure called lithotripsy uses focused shock waves (or, in its ureteroscopic form, a small laser) to break up the stones, but it is not recommended unless the stones are larger than 8 mm.

As with many other health problems, prevention is a better course of action than treatment after the disease has developed. The major causes of kidney stones are an unhealthy diet and inadequate hydration. The standard American diet, high in sugar, sodium and animal products, results in a buildup of toxins. Proper hydration could help eliminate some of these toxins that contribute to the formation of kidney stones.

Fortunately, the size and number of kidney stones can be reduced considerably through a variety of natural treatments. Switching to a healthier diet and increasing fluid intake is the first step, but it is important that specific foods and nutrients be included to not only prevent, but also dissolve or pulverize the stones.

Magnesium - Many Americans are deficient in this beneficial nutrient. Calcium supplementation is promoted by the health industry to support strong bones and combat osteoporosis, but calcium should be balanced by proper magnesium supplementation, as magnesium is involved in hundreds of bodily functions. Some studies have reported that magnesium can reduce the recurrence of kidney stones by about 92 percent.

Dandelion Root - It is possible to pull a dandelion root out of the ground (easier when the ground is soaked), clean it and make tea. Most people would prefer to simply buy some dried organic dandelion root that is available at health stores as a tea, extract or capsule. This root works to cleanse the kidneys as well as the whole body.

Pomegranate juice - While pomegranates are delicious, the juice is sour and has astringent qualities that support healthy kidneys. The juice can be used to make a healthy smoothie to help break down kidney stones. Organic juice is always best whatever the variety.

Basil - Basil is a popular ingredient in many recipes, and the herb is another plant that is beneficial in preventing and reducing kidney stones. Basil can be made into a tea that is a great kidney toner, and a teaspoon of basil juice with raw honey is an effective treatment to help clear up kidney stones.

Roller Coaster - While riding a roller coaster is an exciting experience, the jolts and vibrations of a roller coaster ride can also cause a kidney stone to pass. This does not always happen, but studies show stones can be jostled into position to help them pass. Anyone in pain from kidney stones may be willing to try anything, even a scary roller coaster ride.

Modern medical technology is wonderful and beneficial in many ways, but the medical community does not necessarily think in terms of natural treatments. Many health issues including kidney stones can be avoided or treated effectively by eating a healthy diet and supplementing with nutrients that are lacking in the standard American diet. Many prescription drugs are derived from plants found in nature, and many other plants can be used in their natural state. It is always important to check with a health provider before choosing to begin using any alternate treatment for a health problem. 


Simple Secret to Sleep

Posted on Dec. 2, 2016, 6 a.m. in Sleep Alternative Medicine Brain and Mental Performance

What one technique may improve sleep quality and reduce insomnia and fatigue, among older men and women?



As we age, we typically experience declines in the quality of our sleep. Mindfulness meditation is a self-administered approach that intentionally focuses one's attention on the emotions, thoughts and sensations occurring in the present moment. David Black, from the University of Southern California (California, USA), and colleagues enrolled 49 men and women, ages 55 years and older, who experienced moderately (or greater) disturbed sleep, and divided them into two groups. One group visited the study center for six weekly two-hour sessions of a course in Mindful Awareness Practices for daily living. These included meditation, mindful eating, walking, movement and friendly or loving-kindness practices. A certified teacher led the exercises and also instructed participants to meditate for five minutes daily, gradually increasing to 20 minutes daily. The other group attended six weeks of a sleep hygiene and education course, where they learned about sleep problems, self-care methods for improving sleep, and weekly behavioral sleep hygiene strategies. Prior to the start of the six-week programs, the average sleep quality questionnaire score was 10.

At the end of the study period, those in the meditation group demonstrated improvement in their sleep score by an average of 2.8 points, compared to 1.1 points in the sleep hygiene group. Among those in the meditation group, daytime impairments, including symptoms of insomnia, fatigue and depression, were improved as well. The study authors conclude that: "Formalized mindfulness-based interventions have clinical importance by possibly serving to remediate sleep problems among older adults in the short term, and this effect appears to carry over into reducing sleep-related daytime impairment that has implications for quality of life."


Mindfulness meditation and improvement in sleep quality and daytime impairment among older adults with sleep disturbances: a randomized clinical trial.

Black DS, O'Reilly GA, Olmstead R, Breen EC, Irwin MR.

JAMA Intern Med. 2015 Apr;175(4):494-501. doi: 10.1001/jamainternmed.2014.8081.

PMID: 25686304 Free PMC Article



Sleep disturbances are most prevalent among older adults and often go untreated. Treatment options for sleep disturbances remain limited, and there is a need for community-accessible programs that can improve sleep.


To determine the efficacy of a mind-body medicine intervention, called mindfulness meditation, to promote sleep quality in older adults with moderate sleep disturbances.


Randomized clinical trial with 2 parallel groups conducted from January 1 to December 31, 2012, at a medical research center among an older adult sample (mean [SD] age, 66.3 [7.4] years) with moderate sleep disturbances (Pittsburgh Sleep Quality Index [PSQI] >5).


Participants were randomized to a standardized mindful awareness practices (MAPs) intervention (n = 24) or a sleep hygiene education (SHE) intervention (n = 25); each group received a 6-week intervention (2 hours per week) with assigned homework.


The study was powered to detect between-group differences in moderate sleep disturbance measured via the PSQI at postintervention. Secondary outcomes pertained to sleep-related daytime impairment and included validated measures of insomnia symptoms, depression, anxiety, stress, and fatigue, as well as inflammatory signaling via nuclear factor (NF)-κB.


Using an intent-to-treat analysis, participants in the MAPs group showed significant improvement relative to those in the SHE group on the PSQI. With the MAPs intervention, the mean (SD) PSQIs were 10.2 (1.7) at baseline and 7.4 (1.9) at postintervention. With the SHE intervention, the mean (SD) PSQIs were 10.2 (1.8) at baseline and 9.1 (2.0) at postintervention. The between-group mean difference was 1.8 (95% CI, 0.6-2.9), with an effect size of 0.89. The MAPs group showed significant improvement relative to the SHE group on secondary health outcomes of insomnia symptoms, depression symptoms, fatigue interference, and fatigue severity (P < .05 for all). Between-group differences were not observed for anxiety, stress, or NF-κB, although NF-κB concentrations significantly declined over time in both groups (P < .05).
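[The reported effect size of 0.89 is a standardized mean difference (Cohen's d): the between-group difference divided by a pooled standard deviation. A back-of-the-envelope check from the figures above; this sketch uses the unadjusted post-intervention numbers, so it lands near, not exactly on, the published 0.89:]

```python
import math

def cohens_d(mean_diff, sd1, n1, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return mean_diff / math.sqrt(pooled_var)

# Between-group PSQI difference 1.8; post-intervention SDs 1.9 (n=24) and 2.0 (n=25)
d = cohens_d(1.8, 1.9, 24, 2.0, 25)
print(round(d, 2))  # 0.92, close to the reported (likely model-adjusted) 0.89
```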


The use of a community-accessible MAPs intervention resulted in improvements in sleep quality at immediate postintervention, which was superior to a highly structured SHE intervention. Formalized mindfulness-based interventions have clinical importance by possibly serving to remediate sleep problems among older adults in the short term, and this effect appears to carry over into reducing sleep-related daytime impairment that has implications for quality of life.


[Sci-Hub did not seem to work for the pdf-availed paper below.]

Being mindful of later-life sleep quality and its potential role in prevention.

Spira AP.

JAMA Intern Med. 2015 Apr;175(4):502-3. doi: 10.1001/jamainternmed.2014.8093. No abstract available.

PMID: 25686155

Older adults commonly report disturbed sleep, and an expanding literature suggests that poor sleep increases the risk of adverse health outcomes.1 In this issue of JAMA Internal Medicine, Black et al2 present a randomized clinical trial (RCT) among adults 55 years and older with moderately disturbed sleep, comparing a sleep hygiene intervention with a community-based mindfulness meditation intervention. They assess the outcomes of global sleep quality, insomnia symptoms, fatigue, and depressive symptoms. As the authors explain, effective nonpharmacological interventions that are both “scalable” and “community accessible” are needed to improve disturbed sleep and prevent clinical levels of insomnia. This is imperative given links between insomnia and poor health outcomes, risks of sleep medication use, and the limited availability of health care professionals trained in effective nondrug treatments such as behavior therapy and cognitive behavioral therapy for insomnia. This context makes the positive results of this RCT compelling.

Behavior therapy and cognitive behavioral therapy for insomnia are nonpharmacological interventions for clinical levels of insomnia, with known efficacy across age groups, including in older adults. However, awareness of these therapies is low among physicians and the general public, and the availability of trained health care professionals to administer these treatments is below what is required to address the clinical need. Fortunately, initiatives to more widely disseminate these interventions have begun. The Department of Veterans Affairs has been training non–sleep specialist physicians, psychologists, nurses, and social workers to deliver cognitive behavioral therapy for insomnia within its primary care, mental health, and other clinics.3 Outside of the Department of Veterans Affairs, a nurse-administered brief behavioral treatment for insomnia has been developed for use in primary care.4 Although these are positive developments, such treatments are not yet widely available in primary care settings, rendering the outside-of-the-clinic approach of Black et al2 to treatment (and perhaps prevention) of disturbed sleep all the more relevant.

There are some important methodological aspects of the study by Black et al2 that warrant discussion. The authors did not require participants in their RCT to meet criteria for a particular common sleep disorder such as insomnia, sleep-disordered breathing, or restless legs syndrome. Instead, they excluded individuals reporting sleep-disordered breathing or restless legs syndrome and recruited individuals with a self-reported Pittsburgh Sleep Quality Index (PSQI) exceeding 5, which suggests poor sleep quality.5 Sleep quality is a broad construct that has been defined in various ways. It encompasses and is affected by several aspects of sleep, including sleep fragmentation, extended latency to sleep onset, perceived restfulness of sleep, and (according to some definitions) sleep duration.5 The cutoff value of greater than 5 on the PSQI was originally selected for its ability to differentiate poor sleepers (defined as individuals with problems falling or staying asleep, excessive daytime sleepiness, or depression) from good sleepers with no sleep complaints.5 Some may question the decision to recruit study participants on the basis of a PSQI score rather than, for example, an insomnia diagnosis. However, the selection of the PSQI makes sense given the epidemiological evidence that among older adults a PSQI exceeding 5 is associated with adverse health outcomes, including frailty and lower cognitive function.6,7 Therefore, the PSQI cutoff of greater than 5 should identify individuals at risk for poor sleep-related health outcomes and would likely capture a greater number of them than would be identified by more stringent diagnostic criteria for insomnia. These are important qualities for a community-based intervention in which prevention of clinical insomnia and its outcomes may eventually be the goal. In line with this, while Black et al2 describe their study participants as having moderate sleep problems, a PSQI exceeding 5 can also reflect severe disturbances. 
Therefore, their sample likely included older adults with subclinical insomnia and others with clinical levels of insomnia. In addition, there may well have been cases of sleep-disordered breathing and restless legs syndrome in the sample given that the history of these disturbances was screened for only via self-report. The likelihood that their sample contained individuals with sleep disturbances other than insomnia and that their sleep may have been more than moderately disturbed makes the authors’ positive findings all the more impressive.

This excellent study raises some questions that need to be answered in future research. For example, which aspects of the broad construct of sleep quality does the mindfulness meditation intervention improve? While the global PSQI score was the primary outcome of this trial, how does the intervention affect the different PSQI components,5 including sleep quality, sleep latency, sleep duration, sleep efficiency, sleep disturbances, sleep medications, and daytime dysfunction? In addition, the mechanisms by which mindfulness meditation affects sleep remain unclear. Indeed, depression and fatigue were classified by Black et al2 as secondary “sleep-related daytime impairment” outcomes and were improved by the mindfulness intervention. However, rather than being consequences of improvements in insomnia, it is possible that improvements in depression and fatigue mediated the effect of mindfulness on sleep quality. Future studies with repeated assessment of these putative mediators throughout the intervention would help determine whether this is the case. Finally, the authors indicated that their RCT was the first study to date of the effect on sleep of a meditation modality that did not require movement. This would make it of particular usefulness with older adults experiencing mobility limitations. Future studies of mindfulness meditation for sleep improvement are needed among older adults with restricted mobility.

While mindfulness meditation shows promise as a nonpharmacological, community-based means of improving or preventing insomnia among older persons, additional approaches are needed. Indeed, a menu of appealing options is necessary to compete with the quicker fix offered by the various sleep medications available to older adults. Volunteerism is an approach that appears worthy of evaluation. Prior observational investigations indicate that volunteerism is associated with improved physical and mental health outcomes.8 Intensive, meaningful, physically active, and socially engaging volunteerism is being examined in an RCT as a community-based health promotion program for older adults.9 Specifically, investigators in the Baltimore Experience Corps® study9 randomized older adults (1) to a waiting list control condition or (2) to receive training in supporting children’s learning and managing classroom behavior and spend at least 15 hours per week volunteering in Baltimore City elementary schools as part of a cohort of 15 to 20 older participants. The goal is to determine whether leveraging older adults’ desire to give back to their communities (ie, generativity) and providing this enriched volunteering opportunity promotes their physical and cognitive health and the academic success of students. The Baltimore Experience Corps intervention was expected to improve health by promoting physical, social, and cognitive activity among participants, and each of these forms of activity has been linked to better sleep in other studies.10-12 However, the effect of volunteerism on older adults’ sleep is understudied, and sleep was not a planned outcome of the Baltimore Experience Corps study.
Nonetheless, volunteering—especially intensive and physically, cognitively, and socially engaging volunteering—appears to be an open area for investigation as a community-based intervention that may improve sleep or prevent insomnia and its outcomes, and provide other potential benefits for older adults and the organizations to which they contribute their time.

In summary, Black et al2 are to be applauded for their intriguing study. Other community-based nonpharmacological interventions are needed that improve sleep and perhaps prevent insomnia among older adults. Such interventions may have a key role in safely reducing the morbidity associated with disturbed sleep in later life.



Destroying worn-out cells makes mice live longer

Elegant experiment confirms that targeting senescent cells could treat age-related diseases.

Ewen Callaway

03 February 2016

Jan Van Deursen

Two littermates, almost 2 years old; the mouse on the right had its senescent cells cleared by a drug from 1 year of age onwards.

Eliminating worn-out cells extends the healthy lives of lab mice — an indication that treatments aimed at killing off these cells, or blocking their effects, might also help to combat age-related diseases in humans.

As animals age, cells that are no longer able to divide — called senescent cells — accrue all over their bodies, releasing molecules that can harm nearby tissues. Senescent cells are linked to diseases of old age, such as kidney failure and type 2 diabetes.

Nature Podcast

Ewen Callaway talks to researcher Darren Baker about senescent cells and prolonging life.

To test the cells’ role in ageing, Darren Baker and Jan van Deursen, molecular biologists at the Mayo Clinic in Rochester, Minnesota, and their colleagues engineered mice so that their senescent cells would die off when the rodents were injected with a drug. 

The work involved sophisticated genetic tinkering and extensive physiological testing, but the concept has an elegant simplicity to it. “We think these cells are bad when they accumulate. We remove them and see the consequences,” says Baker. “That’s how I try to explain it to my kids.”

Live long and prosper

Mice whose senescent cells were killed off over six months were healthier, in several ways, than a control group of transgenic mice in which these cells were allowed to build up. Their kidneys worked better and their hearts were more resilient to stress, they tended to explore their cages more and they developed cancers at a later age. Eliminating senescent cells also extended the lifespans of the mice by 20–30%, Baker and van Deursen report in Nature on 3 February1.

The research is a follow-up to a 2011 study, in which their team also found that eliminating senescent cells delayed the onset of diseases of old age in mice, although that work had been done in mice which had a mutation that causes premature ageing2.

In the hope of discovering therapies for diseases of old age, researchers are already looking for drugs that can directly eliminate senescent cells or stop them from churning out factors that damage neighbouring tissue. They include Baker and van Deursen, who have licensed patents to develop such drugs to a company van Deursen has co-founded.

The team's experiment “gives you confidence that senescent cells are an important target,” says Dominic Withers, a clinician-scientist who studies ageing at Imperial College London and who co-wrote a News and Views article for Nature that accompanies the Mayo Clinic report3. “I think that there is every chance this will be a viable therapeutic option.”

Nature doi:10.1038/nature.2016.19287


Ageing: Out with the old.

Gil J, Withers DJ.

Nature. 2016 Feb 11;530(7589):164-5. doi: 10.1038/nature16875. No abstract available.

PMID: 26840486


Combined Use of the Rationalization of Home Medication by an Adjusted STOPP in Older Patients (RASP) List and a Pharmacist-Led Medication Review in Very Old Inpatients: Impact on Quality of Prescribing and Clinical Outcome.

Van der Linden L, Decoutere L, Walgraeve K, Milisen K, Flamaing J, Spriet I, Tournoy J.

Drugs Aging. 2016 Dec 3. [Epub ahead of print]

PMID: 27915457



Polypharmacy and potentially inappropriate drugs have been associated with negative outcomes in older adults, which might be reduced by pharmacist interventions.


Our objective was to evaluate the effect of a pharmacist intervention, consisting of the application of the Rationalization of home medication by an Adjusted STOPP in older Patients (RASP) list and a pharmacist-led medication review on polypharmacy, the quality of prescribing, and clinical outcome in geriatric inpatients.


A monocentric, prospective controlled trial was undertaken at the geriatric wards of a large university hospital. Pharmacists applied the RASP list to the drugs reconciled on admission and additionally performed an expert-based medication review, upon which recommendations were provided to the treating physicians. The primary outcome was the composite endpoint of drug discontinuation and dose reduction of drugs taken on admission. Secondary outcomes included RASP-identified potentially inappropriate medications (PIMs), the number of Emergency Department (ED) visits and quality of life (QOL) registered up to 3 months after discharge.


On average, patients (n = 172) took 10 drugs on admission and were 84.5 (standard deviation 4.8) years of age. More drugs were discontinued or reduced in dose in the intervention group (control vs. intervention: median [IQR] 3 [2-5] vs. 5 [3-7]; p < 0.001). More PIMs were discontinued in the intervention group, leading to fewer PIMs at discharge (control vs. intervention: median [IQR] 2 [1-3] vs. 0.5 [0-1]; p < 0.001). No signal of harm was seen, and a significant improvement in QOL and fewer ED visits without hospitalization were observed.


The combined intervention safely reduced drug use in very old inpatients and outperformed usual geriatric care. An increased QOL was seen, as well as a trend towards fewer ED visits.


Cigarette Smoking and Mortality in Adults Aged 70 Years and Older: Results From the NIH-AARP Cohort.

Nash SH, Liao LM, Harris TB, Freedman ND.

Am J Prev Med. 2016 Nov 22. pii: S0749-3797(16)30517-7. doi: 10.1016/j.amepre.2016.09.036. [Epub ahead of print]

PMID: 27914770



Tobacco use remains a leading modifiable cause of cancer incidence and premature mortality in the U.S. and globally. Despite increasing life expectancy worldwide, less is known about the effects of cigarette smoking on older populations. This study sought to determine the effects of smoking on mortality in older age.


Associations of mortality with self-reported age at smoking cessation, age at smoking initiation, and amount smoked after age 70 years were examined in 160,113 participants of the NIH-AARP Diet and Health Study aged >70 years. Participants completed a questionnaire detailing their smoking use in 2004-2005, and were followed for mortality through December 31, 2011. Analyses were conducted between 2014 and 2016.


Relative to never smokers, current smokers were more likely to die during follow-up (hazard ratio, 3.18; 95% CI=3.04, 3.31). Furthermore, former smokers had lower risks than current smokers (hazard ratios for quitting between ages 30-39, 40-49, 50-59, and 60-69 years were 0.41 [95% CI=0.39, 0.43], 0.51 [95% CI=0.49, 0.54], 0.64 [95% CI=0.61, 0.67], and 0.77 [95% CI=0.73, 0.81], respectively). Among current smokers, mortality was inversely associated with age at initiation, but directly associated with the number of cigarettes smoked per day at age >70 years.


As among younger people, lifetime cigarette smoking history is a key determinant of mortality after age 70 years.


[The below paper is pdf-availed.]

Benefits, pitfalls and risks of phytotherapy in clinical practice in otorhinolaryngology.

Laccourreye O, Werner A, Laccourreye L, Bonfils P.

Eur Ann Otorhinolaryngol Head Neck Dis. 2016 Nov 30. pii: S1879-7296(16)30191-0. doi: 10.1016/j.anorl.2016.11.001. [Epub ahead of print]

PMID: 27914909



To elucidate the benefits, pitfalls and risks of phytotherapy in the clinical practice of otorhinolaryngology.


The PubMed and Cochrane databases were searched using the following keywords: phytotherapy, phytomedicine, herbs, otology, rhinology, laryngology, otitis, rhinitis, laryngitis and otorhinolaryngology. Seventy-two articles (18 prospective randomized studies, 4 Cochrane analyses, 4 meta-analyses and 15 reviews of the literature) devoted to clinical studies were analyzed. Articles devoted to in vitro or animal studies, biochemical analyses or case reports (including fewer than 10 patients) and articles dealing with honey, aromatherapy or minerals were excluded.


Per os ginkgo biloba has no indications in tinnitus, presbycusis or anosmia following viral rhinitis. Traditional Asian medicine has no proven benefit in sudden deafness or laryngeal papillomatosis. Per os mistletoe extracts associated with conventional treatment for head and neck squamous cell carcinoma do not increase 5-year survival. Extracts of various herbs, notably echinacea, eucalyptus, petasites hybridus, pelargonium sidoides, rosemary, spirulina and thyme, show superiority over placebo for rhinosinusitis and allergic rhinitis, as does ginkgo biloba for selected vertigo. There have been encouraging preliminary results for intratumoral injection of mistletoe in head and neck carcinoma and acupoint herbal patching for allergic rhinitis. Herb intake should be screened for in case of certain unexplained symptoms such as epistaxis, headache or dizziness, or signs suggesting allergy. Phytotherapy should be interrupted ahead of surgery and/or chemotherapy.


Scientific proof of the benefit of phytotherapy in otorhinolaryngology remains to be established but, given its widespread use and the reported data, knowledge of this form of treatment needs to be developed.


Herbs; Otorhinolaryngology; Phytomedicine; Phytotherapy


Association between Dietary Isoflavones in Soy and Legumes and Endometrial Cancer: A Systematic Review and Meta-Analysis.

Zhong XS, Ge J, Chen SW, Xiong YQ, Ma SJ, Chen Q.

J Acad Nutr Diet. 2016 Nov 30. pii: S2212-2672(16)31203-5. doi: 10.1016/j.jand.2016.09.036. [Epub ahead of print]

PMID: 27914914



Epidemiologic studies have reported conflicting findings between soy- and legume-derived dietary isoflavones and risk of endometrial cancer.


The aim of the present meta-analysis was to quantitatively investigate the association between daily intake of soy- and legume-derived isoflavones and risk of endometrial cancer.


A broad search was conducted in the following electronic databases: PubMed, EMBASE, Google Scholar, the Cochrane Library, the China Knowledge Resource Integrated Database, and the Chinese Biomedical Database based on combinations of the key words endometrial cancer, isoflavone, soy, and legume for epidemiologic studies that focused on relationships between dietary isoflavones and endometrial cancer risk. A fixed-effect or random-effect model was used to pool study-specific risk estimates.


A total of 13 epidemiologic studies were included in the present meta-analysis, consisting of three prospective cohort studies and 10 population-based case-control studies. The final results indicated that higher dietary isoflavone levels from soy products and legumes were associated with a reduced risk of endometrial cancer (odds ratio [OR] 0.81, 95% CI 0.74 to 0.89). Low heterogeneity was observed (I² = 11.7%; P = 0.327). Subgroup analyses were conducted based on study design, source of dietary isoflavones, and study region. When restricted to study design, dietary isoflavones from soy and legumes played a role in prevention of endometrial cancer in case-control studies (OR 0.81, 95% CI 0.73 to 0.90). However, there did not appear to be an association between dietary isoflavones and endometrial cancer in cohort studies (OR 0.81, 95% CI 0.66 to 1.00). Significant associations were found between dietary isoflavones from soy products (OR 0.82, 95% CI 0.72 to 0.92) and legumes (OR 0.84, 95% CI 0.74 to 0.96) and endometrial cancer. Dietary isoflavones were associated with reduced incidence of endometrial cancer, both in Asian countries (OR 0.78, 95% CI 0.66 to 0.93) and non-Asian countries (OR 0.82, 95% CI 0.73 to 0.92).


The findings suggest a weak inverse association between higher consumption of dietary isoflavones from soy products and legumes and endometrial cancer risk. However, there is still a need for large, prospective epidemiologic studies that provide a higher level of evidence to verify these findings.


Endometrial cancer; Isoflavones; Legume; Meta-analysis; Soy


Association of Industry Sponsorship With Outcomes of Nutrition Studies: A Systematic Review and Meta-analysis.

Chartres N, Fabbri A, Bero LA.

JAMA Intern Med. 2016 Oct 31. doi: 10.1001/jamainternmed.2016.6721. [Epub ahead of print]

PMID: 27802480

This systematic review and meta-analysis examines whether food industry sponsorship of nutrition studies is associated with outcomes that favor the sponsor.

Key Points

Question  Is food industry sponsorship of nutrition studies associated with outcomes that favor the sponsor?

Findings  This systematic review and meta-analysis examined 12 reports and found that 8, which included 340 studies, could be combined in a meta-analysis. Although industry-sponsored studies were more likely to have conclusions favorable to industry than non–industry-sponsored studies, the difference was not significant. There was also insufficient evidence to assess the quantitative effect of industry sponsorship on the results and quality of nutrition research.

Meaning  These findings suggest but do not establish that industry sponsorship of nutrition studies is associated with conclusions that favor the sponsors, and further investigation of differences in study results and quality is needed.



Food industry sponsorship of nutrition research may bias research reports, systematic reviews, and dietary guidelines.


To determine whether food industry sponsorship is associated with effect sizes, statistical significance of results, and conclusions of nutrition studies with findings that are favorable to the sponsor and, secondarily, to determine whether nutrition studies differ in their methodological quality depending on whether they are industry sponsored.


OVID MEDLINE, PubMed, Web of Science, and Scopus from inception until October 2015; the reference lists of included reports.


Reports that evaluated primary research studies or reviews and that quantitatively compared food industry-sponsored studies with those that had no or other sources of sponsorship.


Two reviewers independently extracted data from each report and rated its quality using the ratings of the Oxford Centre for Evidence-Based Medicine, ranging from a highest quality rating of 1 to a lowest of 5.


Results (statistical significance and effect size) favorable to the sponsor and conclusions favorable to the sponsor. If data were appropriate for meta-analysis, we used an inverse variance DerSimonian-Laird random-effects model.
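The inverse-variance DerSimonian-Laird random-effects model the authors name is a standard pooling procedure. As a rough illustration (not the authors' code; the function name and example numbers are hypothetical), it can be sketched as:

```python
import math

def dersimonian_laird(effects, ses):
    """Pool study effect estimates (e.g. log risk ratios) with the
    DerSimonian-Laird random-effects model."""
    w = [1 / se**2 for se in ses]                                 # inverse-variance weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)   # fixed-effect pooled estimate
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))   # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance estimate
    w_re = [1 / (se**2 + tau2) for se in ses]                     # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, se_pooled, tau2

# Pooling two hypothetical log risk ratios:
pooled, se, tau2 = dersimonian_laird([0.1, 0.3], [0.1, 0.1])
```

Exponentiating `pooled` and `pooled ± 1.96 * se` gives the pooled risk ratio and its 95% CI on the original scale.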


Of 775 reports reviewed, 12, with quality ratings ranging from 1 to 4, met the inclusion criteria. Two reports, with data that could not be combined, assessed the association of food industry sponsorship and the statistical significance of research results; neither found an association. One report examined effect sizes and found that studies sponsored by the food industry reported significantly smaller harmful effects for the association of soft drink consumption with energy intake and body weight than those not sponsored by the food industry. Eight reports, including 340 studies, assessed the association of industry sponsorship with authors' conclusions. Although industry-sponsored studies were more likely to have favorable conclusions than non-industry-sponsored studies, the difference was not significant (risk ratio, 1.31 [95% CI, 0.99-1.72]). Five reports assessed methodological quality; none found an association with industry sponsorship.


Although industry-sponsored studies were more likely to have conclusions favorable to industry than non-industry-sponsored studies, the difference was not significant. There was also insufficient evidence to assess the quantitative effect of industry sponsorship on the results and quality of nutrition research. These findings suggest but do not establish that industry sponsorship of nutrition studies is associated with conclusions that favor the sponsors, and further investigation of differences in study results and quality is needed.


Receipt of Antibiotics in Hospitalized Patients and Risk for Clostridium difficile Infection in Subsequent Patients Who Occupy the Same Bed.

Freedberg DE, Salmasian H, Cohen B, Abrams JA, Larson EL.

JAMA Intern Med. 2016 Oct 10. doi: 10.1001/jamainternmed.2016.6193. [Epub ahead of print]

PMID: 27723860

This cohort study reports on the association of receipt of antibiotics by prior hospital bed occupants with increased risk for Clostridium difficile infection in patients subsequently occupying the same bed.

Key Points

Question  Is the receipt of antibiotics by prior hospital bed occupants associated with risk for Clostridium difficile infection (CDI) in subsequent patients who occupy the same bed?

Findings  In this cohort study, receipt of antibiotics by prior patients was associated with a 22% relative increase in risk for CDI in subsequent patients who occupied the same bed. Aside from antibiotics, no other factors related to the prior bed occupants were associated with increased risk for CDI in subsequent patients.

Meaning  Antibiotics given to one patient may alter the local microenvironment to influence a different patient’s risk for CDI.

Importance  Antibiotics are the crucial risk factor for CDI, but it is unknown how one hospitalized patient’s receipt of antibiotics may affect risk for CDI for a different patient within the same environment.



To assess whether receipt of antibiotics by prior hospital bed occupants is associated with increased risk for CDI in subsequent patients who occupy the same bed.


This is a retrospective cohort study of adult patients hospitalized in any 1 of 4 facilities between 2010 and 2015. Patients were excluded if they had recent CDI, developed CDI within 48 hours of admission, had inadequate follow-up time, or if their prior bed occupant was in the bed for less than 24 hours.


The primary exposure was receipt of non-CDI antibiotics by the prior bed occupant and the primary outcome was incident CDI in the subsequent patient to occupy the same bed. Incident CDI was defined as a positive result from a stool polymerase chain reaction for the C difficile toxin B gene followed by treatment for CDI. Demographics, comorbidities, laboratory data, and medication exposures are reported.


Among 100 615 pairs of patients who sequentially occupied a given hospital bed, there were 576 pairs (0.57%) in which subsequent patients developed CDI. Receipt of antibiotics in prior patients was significantly associated with incident CDI in subsequent patients (log-rank P < .01). This relationship remained unchanged after adjusting for factors known to influence risk for CDI including receipt of antibiotics by the subsequent patient (adjusted hazard ratio [aHR], 1.22; 95% CI, 1.02-1.45) and also after excluding 1497 patient pairs among whom the prior patients developed CDI (aHR, 1.20; 95% CI, 1.01-1.43). Aside from antibiotics, no other factors related to the prior bed occupants were associated with increased risk for CDI in subsequent patients.


Receipt of antibiotics by prior bed occupants was associated with increased risk for CDI in subsequent patients. Antibiotics can directly affect risk for CDI in patients who do not themselves receive antibiotics.


Nonfasting Mild-to-Moderate Hypertriglyceridemia and Risk of Acute Pancreatitis.

Pedersen SB, Langsted A, Nordestgaard BG.

JAMA Intern Med. 2016 Nov 7. doi: 10.1001/jamainternmed.2016.6875. [Epub ahead of print]

PMID: 27820614

This cohort study tests the hypothesis that nonfasting mild-to-moderate hypertriglyceridemia is associated with acute pancreatitis.

Key Points

Question  Is nonfasting mild-to-moderate hypertriglyceridemia associated with acute pancreatitis?

Findings  In 116 550 individuals from the general population, nonfasting mild-to-moderate hypertriglyceridemia of 177 mg/dL (2 mmol/L) or higher was associated with high risk of acute pancreatitis, with risk estimates higher than for myocardial infarction.

Meaning  Mild-to-moderate hypertriglyceridemia is associated with increased risk not only for myocardial infarction but also for acute pancreatitis.



Severe hypertriglyceridemia is associated with increased risk of acute pancreatitis. However, the threshold above which triglycerides are associated with acute pancreatitis is unclear.


To test the hypothesis that nonfasting mild-to-moderate hypertriglyceridemia (177-885 mg/dL; 2-10 mmol/L) is also associated with acute pancreatitis.


This prospective cohort study examines individuals from the Copenhagen General Population Study, conducted from 2003 to 2015, and the Copenhagen City Heart Study, initiated in 1976 to 1978 with follow-up examinations in 1981 to 1983, 1991 to 1994, and 2001 to 2003. Median follow-up was 6.7 years (interquartile range, 4.0-9.4 years). The analysis includes 116 550 individuals with a triglyceride measurement from the Copenhagen General Population Study (n = 98 649) and the Copenhagen City Heart Study (n = 17 901). All individuals were followed until the occurrence of an event, death, emigration, or end of follow-up (November 2014), whichever came first.


Plasma levels of nonfasting triglycerides.


Hazard ratios (HRs) for acute pancreatitis (n = 434) and myocardial infarction (n = 3942).


Overall, 116 550 individuals were included in this study (median [interquartile range] age, 57 [47-66] years). Compared with individuals with plasma triglyceride levels less than 89 mg/dL (<1 mmol/L), the multivariable adjusted HRs for acute pancreatitis were 1.6 (95% CI, 1.0-2.6; 4.3 events/10 000 person-years) for individuals with triglyceride levels of 89 mg/dL to 176 mg/dL (1.00 mmol/L-1.99 mmol/L), 2.3 (95% CI, 1.3-4.0; 5.5 events/10 000 person-years) for 177 mg/dL to 265 mg/dL (2.00 mmol/L-2.99 mmol/L), 2.9 (95% CI, 1.4-5.9; 6.3 events/10 000 person-years) for 266 mg/dL to 353 mg/dL (3.00 mmol/L-3.99 mmol/L), 3.9 (95% CI, 1.5-10.0; 7.5 events/10 000 person-years) for 354 mg/dL to 442 mg/dL (4.00 mmol/L-4.99 mmol/L), and 8.7 (95% CI, 3.7-20.0; 12 events/10 000 person-years) for individuals with triglyceride levels greater than or equal to 443 mg/dL (≥5.00 mmol/L) (trend, P = 6 × 10⁻⁸). Corresponding HRs for myocardial infarction were 1.6 (95% CI, 1.4-1.9; 41 events/10 000 person-years), 2.2 (95% CI, 1.9-2.7; 57 events/10 000 person-years), 3.2 (95% CI, 2.6-4.1; 72 events/10 000 person-years), 2.8 (95% CI, 2.0-3.9; 68 events/10 000 person-years), and 3.4 (95% CI, 2.4-4.7; 78 events/10 000 person-years) (trend, P = 6 × 10⁻³¹), respectively. The multivariable adjusted HR for acute pancreatitis was 1.17 (95% CI, 1.10-1.24) per 89 mg/dL (1 mmol/L) higher triglycerides. When stratified by sex, age, education, smoking, hypertension, statin use, study cohort, diabetes, body mass index (calculated as weight in kilograms divided by height in meters squared), alcohol intake, and gallstone disease, these results were similar with no statistical evidence of interaction.
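The paired mg/dL and mmol/L triglyceride values in the abstract follow from a single conversion factor (roughly 88.57 mg/dL per mmol/L for triglycerides). A minimal sketch, with hypothetical function names:

```python
# Approximate conversion factor for triglycerides; the exact value
# depends on the assumed average molar mass (an assumption here).
MGDL_PER_MMOL = 88.57

def mmol_to_mgdl(mmol):
    """Convert a triglyceride concentration from mmol/L to mg/dL."""
    return mmol * MGDL_PER_MMOL

def mgdl_to_mmol(mgdl):
    """Convert a triglyceride concentration from mg/dL to mmol/L."""
    return mgdl / MGDL_PER_MMOL

# The study's 2 mmol/L threshold corresponds to about 177 mg/dL,
# and the top category (>=5 mmol/L) to about 443 mg/dL.
```

This is why the category cut points appear as 177, 266, 354 and 443 mg/dL: they are the 2, 3, 4 and 5 mmol/L thresholds rounded to the nearest mg/dL.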


Nonfasting mild-to-moderate hypertriglyceridemia from 177 mg/dL (2 mmol/L) and above is associated with high risk of acute pancreatitis, with HR estimates higher than for myocardial infarction.


The Medicalization of Common Conditions.

Redberg RF.

JAMA Intern Med. 2016 Oct 3. doi: 10.1001/jamainternmed.2016.6210. [Epub ahead of print] No abstract available.

PMID: 27695854

Editor's Note

When JAMA Internal Medicine launched the Less Is More series 6 years ago, we commented that one area of concern was "medicalization" of common conditions.1 In this issue, Shahraz et al2 elegantly demonstrate how common conditions can be "medicalized." Using NHANES data they find that a widely promoted web-based risk test would label more than 73 million Americans, including more than 80% of those older than 60 years, as being at high risk for "prediabetes," a condition never heard of 10 years ago. We suggest a better approach to preventing the epidemic of obesity and its multiple health-related complications is emphasis on healthful diet, weight loss when appropriate, and increased physical activity at all levels—by schools, the medical profession, and public health and governmental agencies.

Rita F. Redberg, MD, MSc

1. Grady D, Redberg RF. Less is more: how less health care can result in better health. Arch Intern Med. 2010;170(9):749-750. doi:10.1001/archinternmed


2. Prediabetes Risk in Adult Americans According to a Risk Test.

Shahraz S, Pittas AG, Kent DM.

JAMA Intern Med. 2016 Oct 3. doi: 10.1001/jamainternmed.2016.5919. [Epub ahead of print] No abstract available.

PMID: 27695825

Using data from the 2013-2014 NHANES population, this study provides an estimate of the proportion of nondiabetic adults who would be classified as being at high risk for prediabetes according to a web-based risk test.

The Diabetes Prevention Program and other studies found that individuals with impaired glucose tolerance (based on a 75-g oral glucose tolerance test) can decrease their risk of type 2 diabetes developing either by an intensive supervised lifestyle intervention, including diet and exercise modification, or by metformin hydrochloride treatment.1,2 Subsequently, the glycemic criteria for prediabetes were expanded to include hemoglobin A1c and a decreased level for fasting glucose.3 Although the benefit of type 2 diabetes prevention is unclear in this broader group, the Centers for Disease Control and Prevention, American Diabetes Association, and American Medical Association have promoted a web-based risk test to evaluate people at high risk for prediabetes for whom they recommend practice-based laboratory testing.4 We estimated the proportion of the adult, nondiabetic US population that would be classified as being at high risk for prediabetes according to this widely endorsed risk instrument.


Amino Acid Prompts Action

Posted on Dec. 5, 2016, 6 a.m. in Brain and Mental Performance Amino Acids

Supplementation of GABA (gamma aminobutyric acid) may enhance a person’s ability to react to and prioritize actions.


GABA (gamma aminobutyric acid) is an amino acid that is the most prevalent inhibitory neurotransmitter in the central nervous system. GABA is found in large amounts in the hypothalamus, thus implying that it has a fundamental role in hypothalamic-pituitary function, and thus neuroendocrine metabolism.   Laura Steenbergen, from Leiden University (The Netherlands), and colleagues enrolled 30 healthy men and women in a study to assess the role of GABA in planning and controlling different actions.  Subjects received a dietary supplement of GABA (800 mg), or placebo; each then performed a reaction-time test.  The researchers observed that the subjects supplemented with GABA displayed an increase in action selection response time for both signal modes tested. The study authors write that: "These findings, involving the systemic administration of synthetic GABA, provide the first evidence for a possible causal role of the GABA-ergic system in modulating performance in action cascading.”


γ-Aminobutyric acid (GABA) administration improves action selection processes: a randomised controlled trial.

Steenbergen L, Sellaro R, Stock AK, Beste C, Colzato LS.

Sci Rep. 2015 Jul 31;5:12770. doi: 10.1038/srep12770.

PMID: 26227783 Free PMC Article


In order to accomplish a task goal, real-life environments require us to develop different action control strategies in order to rapidly react to fast-moving visual and auditory stimuli. When engaging in complex scenarios, it is essential to prioritise and cascade different actions. Recent studies have pointed to an important role of the gamma-aminobutyric acid (GABA)-ergic system in the neuromodulation of action cascading. In this study we assessed the specific causal role of the GABA-ergic system in modulating the efficiency of action cascading by administering 800 mg of synthetic GABA or 800 mg of oral microcrystalline cellulose (placebo). In a double-blind, randomised, between-group design, 30 healthy adults performed a stop-change paradigm. Results showed that the administration of GABA, compared to placebo, increased action selection when an interruption (stop) and a change towards an alternative response were required simultaneously, and when such a change had to occur after the completion of the stop process. These findings, involving the systemic administration of synthetic GABA, provide the first evidence for a possible causal role of the GABA-ergic system in modulating performance in action cascading.


Peripheral Arterial Disease and Its Association With Arsenic Exposure and Metabolism in the Strong Heart Study.

Newman JD, Navas-Acien A, Kuo CC, Guallar E, Howard BV, Fabsitz RR, Devereux RB, Umans JG, Francesconi KA, Goessler W, Best LT, Tellez-Plaza M.

Am J Epidemiol. 2016 Nov 3. [Epub ahead of print]

PMID: 27810857


At high levels, inorganic arsenic exposure is linked to peripheral arterial disease (PAD) and cardiovascular disease. To our knowledge, no prior study has evaluated the association between low-to-moderate arsenic exposure and incident PAD by ankle brachial index (ABI). We evaluated this relationship in the Strong Heart Study, a large population-based cohort study of American Indian communities. A total of 2,977 and 2,966 PAD-free participants who were aged 45-74 years in 1989-1991 were reexamined in 1993-1995 and 1997-1999, respectively, for incident PAD defined as either ABI <0.9 or ABI >1.4. A total of 286 and 206 incident PAD cases were identified for ABI <0.9 and ABI >1.4, respectively. The sum of inorganic and methylated urinary arsenic species (∑As) at baseline was used as a biomarker of long-term exposure. Comparing the highest tertile of ∑As with the lowest, the adjusted hazard ratios were 0.57 (95% confidence interval (CI): 0.32, 1.01) for ABI <0.9 and 2.24 (95% CI: 1.01, 4.32) for ABI >1.4. Increased arsenic methylation (as percent dimethylarsinate) was associated with a 2-fold increased risk of ABI >1.4 (hazard ratio = 2.04, 95% CI: 1.02, 3.41). Long-term low-to-moderate ∑As and increased arsenic methylation were associated with ABI >1.4 but not with ABI <0.9. Further studies are needed to clarify whether diabetes and enhanced arsenic metabolism increase susceptibility to the vasculotoxic effects of arsenic exposure.
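The study's PAD definition is a simple two-sided cutoff on the ankle-brachial index (ABI): below 0.9 suggests obstructive disease, above 1.4 suggests stiff, incompressible vessels. A sketch of that classification rule (function name and labels are illustrative, not from the paper):

```python
def classify_abi(abi):
    """Classify an ankle-brachial index using the cutoffs applied in the
    Strong Heart Study analysis: ABI < 0.9 or ABI > 1.4 counts as PAD."""
    if abi < 0.9:
        return "PAD (ABI < 0.9)"   # low ABI: obstructive peripheral arterial disease
    if abi > 1.4:
        return "PAD (ABI > 1.4)"   # high ABI: incompressible, calcified vessels
    return "normal"

# Example: an ABI of 1.0 falls in the normal range.
```

The two outcomes are analyzed separately in the study, which is why arsenic exposure can show opposite associations with the low-ABI and high-ABI definitions.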


arsenic; metabolism; peripheral vascular disease

Edited by AlPater
Link to comment
Share on other sites

Pre-Sleep Protein Ingestion to Improve the Skeletal Muscle Adaptive Response to Exercise Training.

Trommelen J, van Loon LJ.

Nutrients. 2016 Nov 28;8(12). pii: E763. Review.

PMID: 27916799


Protein ingestion following resistance-type exercise stimulates muscle protein synthesis rates, and enhances the skeletal muscle adaptive response to prolonged resistance-type exercise training. As the adaptive response to a single bout of resistance exercise extends well beyond the first couple of hours of post-exercise recovery, recent studies have begun to investigate the impact of the timing and distribution of protein ingestion during more prolonged recovery periods. Recent work has shown that overnight muscle protein synthesis rates are restricted by the level of amino acid availability. Protein ingested prior to sleep is effectively digested and absorbed, and thereby stimulates muscle protein synthesis rates during overnight recovery. When applied during a prolonged period of resistance-type exercise training, protein supplementation prior to sleep can further augment gains in muscle mass and strength. Recent studies investigating the impact of pre-sleep protein ingestion suggest that at least 40 g of protein is required to display a robust increase in muscle protein synthesis rates throughout overnight sleep. Furthermore, prior exercise allows more of the pre-sleep protein-derived amino acids to be utilized for de novo muscle protein synthesis during sleep. In short, pre-sleep protein ingestion represents an effective dietary strategy to improve overnight muscle protein synthesis, thereby improving the skeletal muscle adaptive response to exercise training.


casein; exercise; hypertrophy; recovery; sleep


Nut consumption and risk of cardiovascular disease, total cancer, all-cause and cause-specific mortality: a systematic review and dose-response meta-analysis of prospective studies.

Aune D, Keum N, Giovannucci E, Fadnes LT, Boffetta P, Greenwood DC, Tonstad S, Vatten LJ, Riboli E, Norat T.

BMC Med. 2016 Dec 5;14(1):207.

PMID: 27916000



Although nut consumption has been associated with a reduced risk of cardiovascular disease and all-cause mortality, data on less common causes of death have not been systematically assessed. Previous reviews missed several studies, and additional studies have since been published. We therefore conducted a systematic review and meta-analysis of nut consumption and risk of cardiovascular disease, total cancer, and all-cause and cause-specific mortality.


PubMed and Embase were searched for prospective studies of nut consumption and risk of cardiovascular disease, total cancer, and all-cause and cause-specific mortality in adult populations published up to July 19, 2016. Summary relative risks (RRs) and 95% confidence intervals (CIs) were calculated using random-effects models. The burden of mortality attributable to low nut consumption was calculated for selected regions.


Twenty studies (29 publications) were included in the meta-analysis. The summary RRs per 28 grams/day increase in nut intake were, for coronary heart disease, 0.71 (95% CI: 0.63-0.80, I² = 47%, n = 11), stroke, 0.93 (95% CI: 0.83-1.05, I² = 14%, n = 11), cardiovascular disease, 0.79 (95% CI: 0.70-0.88, I² = 60%, n = 12), total cancer, 0.85 (95% CI: 0.76-0.94, I² = 42%, n = 8), all-cause mortality, 0.78 (95% CI: 0.72-0.84, I² = 66%, n = 15), and for mortality from respiratory disease, 0.48 (95% CI: 0.26-0.89, I² = 61%, n = 3), diabetes, 0.61 (95% CI: 0.43-0.88, I² = 0%, n = 4), neurodegenerative disease, 0.65 (95% CI: 0.40-1.08, I² = 5.9%, n = 3), infectious disease, 0.25 (95% CI: 0.07-0.85, I² = 54%, n = 2), and kidney disease, 0.27 (95% CI: 0.04-1.91, I² = 61%, n = 2). The results were similar for tree nuts and peanuts. If the associations are causal, an estimated 4.4 million premature deaths in the Americas, Europe, Southeast Asia, and the Western Pacific would be attributable to a nut intake below 20 grams per day in 2013.
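The RRs above are expressed per 28 g/day increment, while the attributable-burden estimate refers to a 20 g/day intake. Under the common log-linear dose-response assumption, a per-increment RR can be rescaled to another increment; a minimal sketch (the function name is illustrative, and whether the published curves are truly log-linear is an assumption):

```python
import math

def rescale_rr(rr, from_increment, to_increment):
    """Rescale a dose-response relative risk reported per `from_increment`
    of intake to one per `to_increment`, assuming a log-linear relation:
    RR_new = exp(ln(RR) * to/from)."""
    return math.exp(math.log(rr) * to_increment / from_increment)

# E.g. rescaling the coronary heart disease RR of 0.71 per 28 g/day
# to a 20 g/day increment:
chd_rr_20g = rescale_rr(0.71, 28, 20)
```

This kind of rescaling only gives a crude approximation; nonlinear dose-response shapes, which the original analysis also examined, would require the fitted curve itself.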


Higher nut intake is associated with reduced risk of cardiovascular disease, total cancer and all-cause mortality, and mortality from respiratory disease, diabetes, and infections.


All-cause mortality; Cancer; Cardiovascular disease; Cause-specific mortality; Meta-analysis; Nuts; Peanuts


Vitamin C intake modify the impact of dietary nitrite on the incidence of type 2 diabetes: A 6-year follow-up in Tehran Lipid and Glucose Study.

Bahadoran Z, Mirmiran P, Ghasemi A, Carlström M, Azizi F, Hadaegh F.

Nitric Oxide. 2016 Dec 1. pii: S1089-8603(16)30163-X. doi: 10.1016/j.niox.2016.11.005. [Epub ahead of print]

PMID: 27916563



There is no epidemiological study on the association between dietary nitrate (NO3) and nitrite (NO2) intakes and the risk of type 2 diabetes (T2D).


The aim of this study was therefore to examine the potential effect of dietary NO3 and NO2 on the occurrence of T2D.


This longitudinal study was conducted within the framework of the Tehran Lipid and Glucose Study (TLGS) on 2139 T2D-free adults, aged 20-70 years, followed for a median of 5.8 y. Dietary intakes of NO3 and NO2 were estimated at baseline using a validated 168-item semi-quantitative food frequency questionnaire. Multivariate hazard ratios (HRs) and 95% confidence intervals (CIs), adjusted for diabetes risk score (DRS) and dietary intakes of fat, fiber and vitamin C, were calculated for residual energy-adjusted NO3 and NO2 intakes. Since a significant interaction (P = 0.024) was found between NO2 and vitamin C intakes in the multivariable model, stratified analyses were done for below- and above-median vitamin C intakes.


Median (interquartile range; IQR) daily intakes of NO3 and NO2 were 410 mg/d (343-499) and 8.77 mg/d (7.53-10.2), respectively. An increased risk of T2D was observed for higher intakes of total and animal-based NO2 among participants with low vitamin C intake (HR = 2.43, 95% CI = 1.45-4.05 and HR = 1.88, 95% CI = 1.12-3.15, respectively). We found no significant association between NO3, either overall or from plant and animal sources, and the risk of T2D. Plant-derived NO2 was also unrelated to incidence of T2D.


Our findings indicated that higher intakes of total and animal-based NO2 may be an independent dietary risk factor for development of T2D in subjects with lower vitamin C intakes.


Nitrate; Nitrite; Type 2 diabetes; Vitamin C


The Preventive Effect of Sustained Physical Activity on Incident Nonalcoholic Fatty Liver Disease.

Kwak MS, Kim D, Chung GE, Kim W, Kim JS.

Liver Int. 2016 Dec 5. doi: 10.1111/liv.13332. [Epub ahead of print]

PMID: 27917585



Physical activity (PA) is inversely associated with nonalcoholic fatty liver disease (NAFLD) prevalence. However, few studies have evaluated the effect of PA on NAFLD incidence with regard to visceral adipose tissue (VAT) and insulin resistance (IR). We investigated whether PA at baseline and change in PA during follow-up have any effect on incident NAFLD.


We enrolled subjects who underwent health screenings between 2007 and 2008 and participated in voluntary follow-up between 2011 and 2013 (median 4.42 years). Incident NAFLD was defined as NAFLD absence at baseline and presence at follow-up by ultrasonography. PA was measured using a detailed questionnaire-based metabolic equivalent at baseline and follow-up; the difference during follow-up was calculated.


Of the 1,373 subjects enrolled, 288 (21.0%) developed NAFLD. Both total and leisure-time PA at baseline were inversely associated with incident NAFLD (p for trend=0.005 and 0.003, respectively). Decreased PA at follow-up was associated with increased incident NAFLD risk after adjusting for age, gender, body mass index, smoking, hypertension, diabetes, and diet [hazard ratio (HR) 1.45, 95% confidence interval (CI) 1.04-2.02, 4th (most decreased PA) vs. 1st quartile (increased PA), p=0.028]. This relationship was attenuated but remained statistically significant after adjustment for VAT (HR 1.48, 95% CI 1.06-2.06, 4th vs. 1st quartile) and IR (HR 1.59, 95% CI 1.11-2.27, 4th vs. 1st quartile).


This study shows an independent protective effect of PA at baseline on incident NAFLD after 4-year follow-up. Furthermore, sustained or increased PA had a preventive effect on incident NAFLD independent of VAT and IR.


development; exercise; hepatic steatosis; leisure time


Urinary Incontinence in Older Women: The Role of Body Composition and Muscle Strength: From the Health, Aging, and Body Composition Study.

Suskind AM, Cawthon PM, Nakagawa S, Subak LL, Reinders I, Satterfield S, Cummings S, Cauley JA, Harris T, Huang AJ; Health ABC Study.

J Am Geriatr Soc. 2016 Dec 5. doi: 10.1111/jgs.14545. [Epub ahead of print]

PMID: 27918084



To evaluate prospective relationships between body composition and muscle strength with predominantly stress urinary incontinence (SUI) and urgency urinary incontinence (UUI) in older women.


Prospective community-dwelling observational cohort study (Health, Aging, and Body Composition study).


Women initially aged 70 to 79 recruited from Pittsburgh, Pennsylvania, and Memphis, Tennessee (N = 1,475).


Urinary incontinence was assessed using structured questionnaires. Body mass index (BMI), grip strength, quadriceps torque, and walking speed were assessed using physical examination and performance testing. Appendicular lean body mass (ALM) and whole-body fat mass were measured using dual-energy X-ray absorptiometry.


At baseline, 212 (14%) women reported at least monthly predominantly SUI and 233 (16%) at least monthly predominantly UUI. At 3 years, of 1,137 women, 164 (14%) had new or persistent SUI, and 320 (28%) had new or persistent UUI. Women had greater odds of new or persistent SUI if they demonstrated a 5% or greater decrease in grip strength, (adjusted odds ratio (AOR) = 1.60, P = .047) and lower odds of new or persistent SUI if they demonstrated a 5% or greater decrease in BMI (AOR = 0.46, P = .01), a 5% or greater increase in ALM corrected for BMI (AOR = 0.17, P = .004), or a 5% or greater decrease in fat mass (AOR = 0.53, P = .01). Only a 5% or greater increase in walking speed was associated with new or persistent UUI over 3 years (AOR = 1.54, P = .04).


In women aged 70 and older, changes in body composition and grip strength were associated with changes in SUI frequency over time. In contrast, changes in these factors did not influence UUI. Findings suggest that optimization of body composition and muscle strength is more likely to modify risk of SUI than of UUI in older women.


stress urinary incontinence; urgency urinary incontinence


High Serum Adiponectin Level Is a Risk Factor for Anemia in Japanese Men: A Prospective Observational Study of 1,029 Japanese Subjects.

Kohno K, Narimatsu H, Shiono Y, Suzuki I, Kato Y, Sho R, Otani K, Ishizawa K, Yamashita H, Kubota I, Ueno Y, Kato T, Fukao A, Kayama T.

PLoS One. 2016 Dec 5;11(12):e0165511. doi: 10.1371/journal.pone.0165511.

PMID: 27918575


Erythroid abnormalities including anemia and polycythemia are often observed in the general clinical setting. Because recent studies reported that adiponectin negatively affects hematopoiesis, we performed a prospective observational study to assess the relationship between anemia and adiponectin, as well as other parameters, in 1029 Japanese subjects (477 men and 552 women) 40 years of age and older. Body measurements, blood tests, and nutrition intake studies were performed at baseline, and 5 to 7 years later (follow-up). Hemoglobin (Hb) and hematocrit (Hct) levels in men with high serum adiponectin levels were lower at follow-up than at baseline. Multiple regression analysis showed that age, body mass index, adiponectin, and glutamic-pyruvic transaminase were significantly associated with erythroid-related variables (red blood cells, Hb, and Hct) in both men and women (P <0.05). In a logistic regression analysis, adiponectin, fasting blood glucose, and β-natriuretic peptide were significant risk factors for anemia in men, and blood urea nitrogen and amylase were significant risk factors in women. Physical features and nutrient intake were not risk factors for anemia. Our study demonstrates, both clinically and epidemiologically, that a high serum adiponectin level decreases the amounts of erythroid-related variables and is a risk factor for anemia in Japanese men.


New Protein Linked to Aging & Age-Related Diseases

Posted on Dec. 6, 2016, 6 a.m. in Age-related Macular Degeneration Genetic Research

The discovery of Tmem135 could lead to the development of new treatments for conditions causing sight loss in later life.


Scientists have discovered a new protein that links aging with age-related retinal diseases. This could lead to new therapies for eyesight loss in older people. In an article published in the journal eLife, researchers from the University of Wisconsin-Madison studied lab mice and found a protein called Tmem135 (Transmembrane 135) which is responsible for retinal aging. They discovered that mutations of this protein resulted in age-related retinal disease.

Millions Suffer Age-Related Retinal Diseases

Interestingly, previous studies showed that Tmem135 is associated with fat storage and extended lifespan in the roundworm Caenorhabditis elegans. The molecular function of the Tmem135 protein in the worms has yet to be determined. In the new study, researchers showed that irregular levels of this protein are linked to symptoms of macular degeneration, a common age-dependent retinal disease.

About 11 million people in the United States are afflicted with age-related macular degeneration (AMD), which affects central vision in both eyes. The condition worsens over time, making it increasingly difficult to see, and there is no cure. There are two types of AMD, wet and dry, with the dry form affecting 90% of those with AMD; no scientifically proven treatments are available. The study's researchers, however, have identified Tmem135 and its defects as a target for new medical treatments for people suffering retinal conditions.

Tmem135 May Lead to Future Retinal Treatments

How the team discovered Tmem135 is interesting. They used existing mouse models whose retinal abnormalities matched those found in mice with early-onset retinal disease. Gene mapping of these lab mice revealed a mutation in Tmem135 thought to be the cause of the retinal conditions. The researchers also discovered that the protein plays a role in regulating the size of mitochondria in cells, thereby determining the pace of aging in the retina, and that Tmem135 is critical for protecting cells against environmental stress, further slowing retinal aging.

Mutations in Tmem135 lead to accelerated aging in the retina of mice. Its other role, as a regulator of cellular mitochondria, confirms that the two processes are molecularly related. The study concluded that Tmem135 is a key protein that warrants further exploration for potential use in future human treatments. The research team is now working to determine the exact biochemical function of Tmem135 in cell mitochondria; further work is also needed to determine its role in the aging process of various tissues and in other age-related diseases.


Mouse <i>Tmem135</i> mutation reveals a mechanism involving mitochondrial dynamics that leads to age-dependent retinal pathologies.

Lee WH, Higuchi H, Ikeda S, Macke EL, Takimoto T, Pattnaik BR, Liu C, Chu LF, Siepka SM, Krentz KJ, Rubinstein CD, Kalejta RF, Thomson JA, Mullins RF, Takahashi JS, Pinto LH, Ikeda A.

Elife. 2016 Nov 15;5. pii: e19264. doi: 10.7554/eLife.19264.

PMID: 27863209 Free PMC Article


While the aging process is central to the pathogenesis of age-dependent diseases, it is poorly understood at the molecular level. We identified a mouse mutant with accelerated aging in the retina as well as pathologies observed in age-dependent retinal diseases, suggesting that the responsible gene regulates retinal aging, and its impairment results in age-dependent disease. We determined that a mutation in the transmembrane 135 (Tmem135) is responsible for these phenotypes. We observed localization of TMEM135 on mitochondria, and imbalance of mitochondrial fission and fusion in mutant Tmem135 as well as Tmem135 overexpressing cells, indicating that TMEM135 is involved in the regulation of mitochondrial dynamics. Additionally, mutant retina showed higher sensitivity to oxidative stress. These results suggest that the regulation of mitochondrial dynamics through TMEM135 is critical for protection from environmental stress and controlling the progression of retinal aging. Our study identified TMEM135 as a critical link between aging and age-dependent diseases.


ENU; age-dependent retinal diseases; aging; cell biology; mitochondrial dynamics; mouse; neuroscience; retina; retinal pigment epithelium


Stressed-induced TMEM135 protein is part of a conserved genetic network involved in fat storage and longevity regulation in Caenorhabditis elegans.

Exil VJ, Silva Avila D, Benedetto A, Exil EA, Adams MR, Au C, Aschner M.

PLoS One. 2010 Dec 3;5(12):e14228. doi: 10.1371/journal.pone.0014228.

PMID: 21151927 Free PMC Article


Disorders of mitochondrial fat metabolism lead to sudden death in infants and children. Although survival is possible, the underlying molecular mechanisms which enable this outcome have not yet been clearly identified. Here we describe a conserved genetic network linking disorders of mitochondrial fat metabolism in mice to mechanisms of fat storage and survival in Caenorhabditis elegans (C. elegans). We have previously documented a mouse model of mitochondrial very-long chain acyl-CoA dehydrogenase (VLCAD) deficiency. We originally reported that the mice survived birth, but, upon exposure to cold and fasting stresses, these mice developed cardiac dysfunction, which greatly reduced survival. We used cDNA microarrays to outline the induction of several markers of lipid metabolism in the heart at birth in surviving mice. We hypothesized that the induction of fat metabolism genes in the heart at birth is part of a regulatory feedback circuit that plays a critical role in survival. The present study uses a dual approach employing both C57BL/6 mice and the nematode, C. elegans, to focus on TMEM135, a conserved protein which we have found to be upregulated 4.3 (±0.14)-fold in VLCAD-deficient mice at birth. Our studies have demonstrated that TMEM135 is highly expressed in mitochondria and in fat-loaded tissues in the mouse. Further, when fasting and cold stresses were introduced to mice, we observed 3.25 (±0.03)- and 8.2 (±0.31)-fold increases in TMEM135 expression in the heart, respectively. Additionally, we found that deletion of the tmem135 orthologue in C. elegans caused a 41.8% (±2.8%) reduction in fat stores, a reduction in mitochondrial action potential and decreased longevity of the worm. In stark contrast, C. elegans transgenic animals overexpressing TMEM-135 exhibited increased longevity upon exposure to cold stress. 
Based on these results, we propose that TMEM135 integrates biological processes involving fat metabolism and energy expenditure in both the worm (invertebrates) and in mammalian organisms. The data obtained from our experiments suggest that TMEM135 is part of a regulatory circuit that plays a critical role in the survival of VLCAD-deficient mice and perhaps in other mitochondrial genetic defects of fat metabolism as well.



Regenerative medicine

Vol. 540, No. 7632_supp, pp. S49–S91




Our bodies aren’t forever: parts wear out, trauma breaks things and organs stop functioning. Sometimes, a drug can remedy a chemical imbalance or surgery can repair a structural failure, but there are times when there is no substitute for replacing a part with human tissue or even an entire organ. Rapid advances in the field of regenerative medicine are bringing that possibility closer to reality.

Free full access


Gut bacteria linked to Parkinson's

Nature 540, 172–173 (08 December 2016) doi:10.1038/540172d

Published online 07 December 2016

Subject terms: Microbiology Neurodegeneration

Bacteria living in the gut may contribute to movement problems seen in disorders such as Parkinson's disease.

Timothy Sampson and Sarkis Mazmanian at the California Institute of Technology in Pasadena and their team generated mice that lacked their own bacteria and had been genetically engineered so that their brains overproduce α-synuclein — a protein that forms clumps in the brains of people with Parkinson's. They found that these germ-free mice moved more freely and accumulated less α-synuclein in their brains than animals with gut microbes. When the team transplanted microbes from the faeces of people with Parkinson's disease into the guts of the mice, the animals showed more movement dysfunction than those that received bacteria from healthy humans.

The authors think that molecules made by gut microbes could activate certain immune cells and boost inflammation in general, which then enhances the clumping of α-synuclein in the brain.

Cell 167, 1469–1480 (2016)


Gut Microbiota Regulate Motor Deficits and Neuroinflammation in a Model of Parkinson's Disease.

Sampson TR, Debelius JW, Thron T, Janssen S, Shastri GG, Ilhan ZE, Challis C, Schretter CE, Rocha S, Gradinaru V, Chesselet MF, Keshavarzian A, Shannon KM, Krajmalnik-Brown R, Wittung-Stafshede P, Knight R, Mazmanian SK.

Cell. 2016 Dec 1;167(6):1469-1480.e12. doi: 10.1016/j.cell.2016.11.018.

PMID: 27912057


Gut microbes promote α-synuclein-mediated motor deficits and brain pathology

Depletion of gut bacteria reduces microglia activation

SCFAs modulate microglia and enhance PD pathophysiology

Human gut microbiota from PD patients induce enhanced motor dysfunction in mice


The intestinal microbiota influence neurodevelopment, modulate behavior, and contribute to neurological disorders. However, a functional link between gut bacteria and neurodegenerative diseases remains unexplored. Synucleinopathies are characterized by aggregation of the protein α-synuclein (αSyn), often resulting in motor dysfunction as exemplified by Parkinson’s disease (PD). Using mice that overexpress αSyn, we report herein that gut microbiota are required for motor deficits, microglia activation, and αSyn pathology. Antibiotic treatment ameliorates, while microbial re-colonization promotes, pathophysiology in adult animals, suggesting that postnatal signaling between the gut and the brain modulates disease. Indeed, oral administration of specific microbial metabolites to germ-free mice promotes neuroinflammation and motor symptoms. Remarkably, colonization of αSyn-overexpressing mice with microbiota from PD-affected patients enhances physical impairments compared to microbiota transplants from healthy human donors. These findings reveal that gut bacteria regulate movement disorders in mice and suggest that alterations in the human microbiome represent a risk factor for PD.



Renal cell carcinoma survival and body mass index: a dose-response meta-analysis reveals another potential paradox within a paradox.

Bagheri M, Speakman JR, Shemirani F, Djafarian K.

Int J Obes (Lond). 2016 Dec;40(12):1817-1822. doi: 10.1038/ijo.2016.171. Review.

PMID: 27686524



In healthy subjects increasing body mass index (BMI) leads to greater mortality from a range of causes. Following onset of specific diseases, however, the reverse is often found: called the 'obesity paradox'. But we recently observed the phenomenon called the 'paradox within the paradox' for stroke patients.


The objective of our study was to examine the effect of each unit increase in BMI on renal cancer-specific survival (CSS), cancer-specific mortality, overall survival (OS) and overall mortality.


Random-effects generalized least squares models for trend estimation were used to analyze the data. Eight studies, comprising 8699 survivors among 10 512 renal cell carcinoma (RCC) patients, met the inclusion criteria, including 5 on CSS and 3 on OS.


The association of BMI with CSS and OS was non-linear (P<0.0001, P=0.004, respectively). We observed that CSS increased in relation to BMI, indicating that there was the obesity paradox in RCC. However, each unit increase in BMI over 25 was associated with decreased OS, indicating that RCC may also exhibit a paradox within the paradox.


Inconsistent effects of increases in BMI on CSS and OS, as previously observed for stroke, creates a paradox (different directions of mortality for different causes) within the obesity paradox.


Perpetuating effects of androgen deficiency on insulin resistance.

Cameron JL, Jain R, Rais M, White AE, Beer TM, Kievit P, Winters-Stone K, Messaoudi I, Varlamov O.

Int J Obes (Lond). 2016 Dec;40(12):1856-1863. doi: 10.1038/ijo.2016.148.

PMID: 27534842



Androgen deprivation therapy (ADT) is commonly used for treatment of prostate cancer but is associated with side effects, such as sarcopenia and insulin resistance. The role of lifestyle factors such as diet and exercise on insulin sensitivity and body composition in testosterone-deficient males is poorly understood. The aim of the present study was to examine the relationships between androgen status, diet and insulin sensitivity.


Middle-aged (11-12 years old) intact and orchidectomized male rhesus macaques were maintained for 2 months on a standard chow diet and then exposed for 6 months to a Western-style, high-fat/calorie-dense diet (WSD) followed by 4 months of caloric restriction (CR). Body composition, insulin sensitivity, physical activity, serum cytokine levels and adipose biopsies were evaluated before and after each dietary intervention.


Both intact and orchidectomized animals gained similar proportions of body fat, developed visceral and subcutaneous adipocyte hypertrophy and became insulin resistant in response to the WSD. CR reduced body fat in both groups but reversed insulin resistance only in intact animals. Orchidectomized animals displayed progressive sarcopenia, which persisted after the switch to CR. Androgen deficiency was associated with increased levels of interleukin-6 and macrophage-derived chemokine (C-C motif chemokine ligand 22), both of which were elevated during CR. Physical activity levels showed a negative correlation with body fat and insulin sensitivity.


Androgen deficiency exacerbated the negative metabolic side effects of the WSD such that CR alone was not sufficient to improve altered insulin sensitivity, suggesting that ADT patients will require additional interventions to reverse insulin resistance and sarcopenia.


Will Fasting To Lose Weight Work For Me? Pros And Cons; What Intermittent Fasting Feels Like

Dec 5, 2016 06:17 PM 

By Quora Contributor


This question originally appeared on Quora. Answer by Darren Beattie.

Is intermittent fasting (IF) a good idea for weight loss? It can help with dietary adherence in some folks and probably help them manage or at least learn about hunger sensations.

It’s not for everybody, though; you should probably experiment with different variations of it, and consider it only after making some key basic nutritional adjustments. It doesn’t suit everyone’s lifestyle, but it might suit yours.

It can only create fat loss the same way any other eating strategy does, via an energy deficit.

I think it's first necessary to recognize that the majority of us naturally fast 12-14 hours per day already during our sleep.

Then we need to establish some ground rules about Intermittent Fasting as there are numerous methods.

24 Hour Fasts (Once or Twice per Week)

Alternate Day Method (Eat Every Other Day)

16 Hour Fast Method (popularized by Martin Berkhan at Lean Gains)

20 Hour Fast Method (popularized by the Ori Hofmekler's Warrior Diet)

Religious Fasting (period fasting as related to religious practices)

Probably others…
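The arithmetic behind the daily methods above is just the feeding window left over after the fast. A minimal sketch (the helper function is hypothetical, not from the answer):

```python
# Hypothetical helper (not from the answer): the daily feeding window
# implied by a given fast length.
def feeding_window_hours(fast_hours, day_hours=24):
    """Hours left for eating each day after fasting `fast_hours` hours."""
    if not 0 <= fast_hours <= day_hours:
        raise ValueError("fast must fit within the day")
    return day_hours - fast_hours

# The 16-hour (Lean Gains) and 20-hour (Warrior Diet) methods:
for name, fast in [("16:8 / Lean Gains", 16), ("Warrior Diet", 20)]:
    print(f"{name}: {feeding_window_hours(fast)}-hour feeding window")
# prints: 16:8 / Lean Gains: 8-hour feeding window
#         Warrior Diet: 4-hour feeding window
```

Note that the natural 12-14 hour overnight fast mentioned above already leaves only a 10-12 hour feeding window, so the 16:8 method is a smaller step than it first appears.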

IF has become all the rage recently because it has some good evidence pointing in the direction of fat loss in particular. However, it’s important to keep in mind that there is actually very little research in this field (no one wants to fund not eating).

It has never really been shown to be any more effective for fat loss (AKA weight loss) than simple caloric restriction when calories are accounted for directly (i.e. not personal recall, which is always inaccurate). That makes perfect sense from a physiological standpoint, because you can’t defy the laws of physics just by eating only at a certain time.

When IF does seem more effective for fat loss it’s probably just because it creates a calorie restriction in certain people and frankly people who don’t have success with a strategy just don’t rave about it as much as people who do.

A colleague, Dr. John Berardi, wrote a great (and free) book on the subject a few years back, based on his experience and a little experiment he ran on himself using blood work. You can read it here: PN-IF

What you may note from the health markers in this case study is that not a lot actually changed for him (though he did get leaner), presumably because before he started fasting he already had a solid foundation of eating healthfully. This isn’t a formal study, so a lot of variables haven’t been accounted for.

In a great deal of the nutrition research I’ve plowed through over the years, inducing a change from a test subject’s ‘normal’ diet (in this case, taking someone out of their usual diet routine) in pretty much any way yields significant changes in various health markers, though not necessarily in weight (again, an energy deficit needs to be present for weight loss, or more specifically fat loss, to occur).

What this case study suggests to me is that giving yourself a feeding window or a fasting window doesn’t really create much of a change outside of the existing diet. When people do get dramatic changes, it’s probably more related to the fact that doing IF caused a more dramatic shift in their eating and potentially training habits.

For instance, training is a significant part of the lean gains protocol.

Sound eating habits and a generally sound diet, built on foundational eating skills like these (disclaimer, my blog: Eating for Fat Loss - Skill Based Fitness), will probably yield better changes in health markers by comparison, assuming that in doing IF you don’t change any of your current eating habits and simply cut out food for an extended period of time. Of course, the fat loss that occurs with a skill-based approach is still the result of creating an energy deficit.

In my opinion developing these habits (and there are potentially others) is the first step or foundational step in eating as a generalized consideration (more veggies, better/more lean protein, enough healthy fats and carbs to match training/exercise requirements).

Meaning, if you plan to try IF (as I've read of many people doing unfortunately) with your current diet of simple carbohydrates, sweets, and other nutrient-free calories, you will probably not get the results you are looking for. Or you may get the results you are looking for but with a disregard for the possible health outcomes down the road.

I would warn that IF is not an excuse to eat whatever you want at re-feed times and that assuming it to be a miraculous cure for low body fat would be a mistake. Many people appear to treat it as such.

The real benefits of IF appear to be:

Individual adherence (i.e. an eating strategy that works for the individual)

Probably helps suppress/manage hunger and/or recognize hunger cues better

Meaning that if fasting helps you achieve an energy deficit and your goal is weight loss, then great, dietary adherence is probably the most crucial thing to consider in any eating strategy.

It also appears effective at teaching you to manage or mitigate hunger sensations (at least in certain people). Hunger usually accompanies energy restriction, so anything a person can do to manage hunger effectively can really help them lose fat. If you’re better at recognizing when you’re really hungry, you can probably manage a real energy restriction better.

Note that there are many other ways to manage hunger too, like increasing protein, so it’s really about finding something that works for you.

It might not be a great strategy if you:

Want to gain weight/muscle (though the Lean Gains approach certainly has some history here; if you can create an energy surplus within a restricted feeding window, by all means try it. Some people use it with some success to gain muscle more slowly and reduce the fat that generally comes with weight gain)

Have a history of disordered eating; in that case you might want to consider something else (along with getting the necessary help)

Fasting for weight loss fundamentally works like any other form of caloric restriction. For the evidence I've read showing positive changes in health markers, we do not know for sure whether those changes are due to anything more than the caloric restriction itself.

Caloric restriction has been shown to increase longevity and improve many health markers.

There is some other evidence I've read revealing that there are also possible downsides to certain health markers to go with the good changes. It’s a mixed bag and hard to determine.

As a coach who's played with a few types of IF, it personally did not appeal to me the way it may to you, though I have used various types successfully with certain clients. I often run a 1–2 week experiment with clients to give them an appreciation for what real hunger is. I didn't like the sensation of purposefully putting off eating; it didn't jibe with my social life, my sports performance, or the times when I genuinely felt hungry (sometimes I would cut a planned fast short because of this). For other people it works with their schedule, their training schedule, their lifestyle, etc.

I also have some general concerns about its use with certain individuals, in that it can create an odd relationship with food and may, for instance, encourage or lead to eating disorders, but the research on that is also a bit of a mixed bag.

Too much caloric restriction can mean malnutrition as easily as longevity, and in my opinion could hinder your quality of life (even if it becomes longer).

Based on this, I generally consider intermittent fasting (IF) more of an intermediate-to-advanced nutritional strategy for weight loss for most folks. I prefer to start more simply (with the strategies listed above) and build up to something like this. It won't be any better than any other diet solution for many people, particularly without good nutrition already in place, and it would still have to create an energy deficit for fat loss.

I would also say that IF is more appropriately used as a short-term strategy that helps already-lean people get leaner, what some people in the bodybuilding community refer to as ‘cutting.’ You know, you've been hovering just above 10% body fat and you want to get below that so you can do a photo shoot with your abs pulsating, that kind of thing. It just clicks with certain people, too.

Ultimately, if you are doing the majority of those things especially well already but still not getting where you’d like to, then IF may be something worthwhile to experiment with. It has some merits, most of which I believe are more hunger/lifestyle related than weight-loss specific, but a lot of the time that’s all a person needs to get over a plateau.

I would encourage you to think of IF as just a tool (like, in my opinion, calorie counting, nutrient timing, or carb cycling) that you can use from time to time, once you've developed some quality eating habits. Simply doing IF on its own, with no consideration of what you are ingesting, would be a waste of time in my opinion.

The timing doesn’t matter as much as what that restricted feeding window ultimately does. If you overeat in 8 hours, it has the same effect as overeating in a 12-hour window: weight gain. The same goes for undereating: weight loss.
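The "windows don't matter, totals do" point can be sketched numerically. This uses the common rule-of-thumb figure of roughly 7700 kcal per kg of body fat, which is a simplification and not something from this answer:

```python
# Rough conventional estimate; real-world results vary with composition
# of weight lost, adaptation, etc.
KCAL_PER_KG_FAT = 7700

def weekly_weight_change_kg(intake_kcal, expenditure_kcal, days=7):
    """Approximate weight change over `days`; positive = gain, negative = loss.
    Note the feeding-window length never enters the calculation: only the
    daily intake/expenditure balance matters."""
    return (intake_kcal - expenditure_kcal) * days / KCAL_PER_KG_FAT

print(round(weekly_weight_change_kg(3000, 2500), 2))  # surplus  -> 0.45 kg gain
print(round(weekly_weight_change_kg(2000, 2500), 2))  # deficit -> -0.45 kg loss
```

Whether the 3000 kcal is eaten in an 8-hour or a 12-hour window, the weekly balance, and therefore the predicted weight change, is the same.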

I would start with the Lean Gains approach (pretty much skipping breakfast), as it's the least daunting of those listed, and then work towards adding hours.

The Warrior Diet is a 20 hour fasting method, whereby you can consume small amounts of veggie shakes to reduce hunger during your fast (among many other dietary suggestions).

For info on the Warrior Diet or Method of IF, check out Ori Hofmekler's website at The Warrior Diet.

For the longer 24 hour fasting method(s) you can also check out the book, "Eat, Stop, Eat" by Brad Pilon.

Each has its own merit, usually based on the adherence/hunger factors. I found going an entire day without food challenging, let alone doing it weekly; how would that help me stick with it? The more moderate methods worked more effectively for me personally, but I’m not you. So don’t do anything you can’t or won’t stick with; it won’t help you in the long run.

Whatever you choose to do, I encourage you to experiment with it and gauge how it makes you feel and operate.

More from Quora:

Is it healthy to fast for one day per week?

Is it okay to work out in the morning and skip breakfast then go to work?

Can I do intermittent fasting 6 days a week?


[The below paper is pdf-availed.]

Telomere length is longer in women with late maternal age.

Fagan E, Sun F, Bae H, Elo I, Andersen SL, Lee J, Christensen K, Thyagarajan B, Sebastiani P, Perls T, Honig LS, Schupf N; Long Life Family Study.

Menopause. 2016 Dec 5. [Epub ahead of print]

PMID: 27922939



Maternal age at birth of last child has been associated with maternal longevity. The aim of this study was to determine whether older women with a history of late maternal age at last childbirth had a longer leukocyte telomere length than those with maternal age at last childbirth of 29 years or less.


A nested case control study was conducted using data from the Long Life Family Study. Three hundred eighty-seven women who gave birth to at least one child and lived to the top fifth percentile of their birth cohort, or died before the top fifth percentile of their birth cohort died, but were at least 70 years old, were studied. Logistic regression models using generalized estimating equations were used to determine the association between tertiles of telomere length and maternal age at last childbirth, adjusting for covariates.


Age at birth of the last child was significantly associated with leukocyte telomere length. Compared with women who gave birth to their last child before the age of 29, women who were past the age of 33 when they had their last child were two to three times more likely to have leukocyte telomere length in the second and third tertiles than in the first tertile.


These findings show an association between longer leukocyte telomere length and a later maternal age at birth of last child, suggesting that extended maternal age at last childbirth may be a marker for longevity.


Chronological Lifespan in Yeast Is Dependent on the Accumulation of Storage Carbohydrates Mediated by Yak1, Mck1 and Rim15 Kinases.

Cao L, Tang Y, Quan Z, Zhang Z, Oliver SG, Zhang N.

PLoS Genet. 2016 Dec 6;12(12):e1006458. doi: 10.1371/journal.pgen.1006458.

PMID: 27923067


Upon starvation for glucose or any other macronutrient, yeast cells exit from the mitotic cell cycle and acquire a set of characteristics that are specific to quiescent cells to ensure longevity. Little is known about the molecular determinants that orchestrate quiescence entry and lifespan extension. Using starvation-specific gene reporters, we screened a subset of the yeast deletion library representing the genes encoding 'signaling' proteins. Apart from the previously characterised Rim15, Mck1 and Yak1 kinases, the SNF1/AMPK complex, the cell wall integrity pathway and a number of cell cycle regulators were shown to be necessary for proper quiescence establishment and for extension of chronological lifespan (CLS), suggesting that entry into quiescence requires the integration of starvation signals transmitted via multiple signaling pathways. The CLS of these signaling mutants, and those of the single, double and triple mutants of RIM15, YAK1 and MCK1 correlates well with the amount of storage carbohydrates but poorly with transition-phase cell cycle status. Combined removal of the glycogen and trehalose biosynthetic genes, especially GSY2 and TPS1, nearly abolishes the accumulation of storage carbohydrates and severely reduces CLS. Concurrent overexpression of GSY2 and TSL1 or supplementation of trehalose to the growth medium ameliorates the severe CLS defects displayed by the signaling mutants (rim15Δyak1Δ or rim15Δmck1Δ). Furthermore, we reveal that the levels of intracellular reactive oxygen species are cooperatively controlled by Yak1, Rim15 and Mck1, and the three kinases mediate the TOR1-regulated accumulation of storage carbohydrates and CLS extension. Our data support the hypothesis that metabolic reprogramming to accumulate energy stores and the activation of anti-oxidant defence systems are coordinated by Yak1, Rim15 and Mck1 kinases to ensure quiescence entry and lifespan extension in yeast.


Should blood pressure goal be individualized in hypertensive patients?

Yannoutsos A, Kheder-Elfekih R, Halimi JM, Safar ME, Blacher J.

Pharmacol Res. 2016 Dec 2. pii: S1043-6618(16)30588-6. doi: 10.1016/j.phrs.2016.11.037. [Epub ahead of print] Review.

PMID: 27919826


The aim of the present review is to consider the clinical relevance of individualized blood pressure (BP) goals under treatment in hypertensive patients according to their age, comorbidities or established cardiovascular (CV) disease. Evidence from large-scale randomized trials to support a lower BP goal, as initially recommended by guidelines in high-risk hypertensive patients, was lacking. Recently, the randomized intervention SPRINT trial studied two treatment targets for systolic BP (120 mm Hg versus 140 mm Hg in the intensive and standard treatment groups, respectively) among high-risk hypertensive patients without diabetes and without a history of prior stroke. The trial was stopped prematurely owing to a significantly lower rate of the primary composite outcome and all-cause mortality in the intensive treatment group. Several practical questions have to be considered. First, using an automated measurement system at an office visit during the SPRINT protocol, while the patient was seated alone after 5 minutes of quiet rest, may likely have resulted in lower BP values than would normally be obtained with routine BP measurement. A target systolic BP of 120 mm Hg in the SPRINT trial may thus be equated to a target systolic BP of 130 mm Hg in the real-world office setting. Second, careful and repeated examinations of SPRINT participants may have led to fewer adverse events (more frequent in the intensive treatment group) than would be expected in the real-world setting. The safety profile of this intensive treatment approach should therefore remain a matter of concern in clinical practice, especially in elderly patients and in patients with diabetes or established CV or renal disease. Orthostatic hypotension should alert the clinician to withhold up-titration. Third, beyond the question of the BP goal, the choice of antihypertensive medication and effective 24-hour BP control are important to consider in the context of a BP-lowering strategy. In particular, ambulatory and nighttime BP measurements should be considered for individualized hypertension care.


J-curve phenomenon; blood pressure goal; cardiovascular disease; diabetes; hypertension; orthostatic hypotension


HDL function is impaired in acute myocardial infarction independent of plasma HDL cholesterol levels.

Annema W, Willemsen HM, de Boer JF, Dikkers A, van der Giet M, Nieuwland W, Muller Kobold AC, van Pelt LJ, Slart RH, van der Horst IC, Dullaart RP, Tio RA, Tietge UJ.

J Clin Lipidol. 2016 Nov - Dec;10(6):1318-1328. doi: 10.1016/j.jacl.2016.08.003.

PMID: 27919348



High-density lipoproteins (HDLs) protect against the development of atherosclerotic cardiovascular disease. HDL function represents an emerging concept in cardiovascular research.


This study investigated the association between HDL functionality and acute myocardial infarction (MI) independent of HDL-cholesterol plasma levels.


Participants (non-ST-segment elevation MI, non-STEMI, n = 41; STEMI, n = 37; non-MI patients, n = 33) from a prospective follow-up study enrolling patients with acute chest pain were matched for age and plasma HDL cholesterol. The in vitro capacity of HDL to (1) mediate cholesterol efflux from macrophage foam cells, (2) prevent low-density lipoprotein oxidation, and (3) inhibit TNF-α-induced vascular adhesion molecule-1 expression in endothelial cells was determined.


STEMI-HDL displayed reduced cholesterol efflux (P < .001) and anti-inflammatory functionality (P = .001), whereas the antioxidative properties were unaltered. Cholesterol efflux correlated with the anti-inflammatory HDL activity (P < .001). C-reactive protein levels, a marker of systemic inflammation, were not independently associated with impaired HDL function, but plasma myeloperoxidase levels specifically were (efflux: P = .022; anti-inflammation: P < .001). Subjects in the higher risk quartile of efflux (odds ratio [OR], 5.66; 95% confidence interval [CI], 1.26-25.00; P = .024) as well as anti-inflammatory functionality of HDL (OR, 5.53; 95% CI, 1.83-16.73; P = .002) had a higher OR for MI vs those in the three lower risk quartiles combined.


Independent of plasma HDL cholesterol levels, 2 of 3 antiatherogenic HDL functionalities tested were significantly impaired in STEMI patients, namely cholesterol efflux and anti-inflammatory properties. Increased myeloperoxidase levels might represent a major contributing mechanism for decreased HDL functionality in MI patients.


Acute coronary syndrome; Cholesterol; Cholesterol efflux; HDL function; Inflammation; Oxidation


[Why do they do that?  The high-carbohydrate diet was more a high-sugar diet than one based on other carbohydrates.]

Adverse effects on insulin secretion of replacing saturated fat with refined carbohydrate but not with monounsaturated fat: A randomized controlled trial in centrally obese subjects.

Chang LF, Vethakkan SR, Nesaretnam K, Sanders TA, Teng KT.

J Clin Lipidol. 2016 Nov - Dec;10(6):1431-1441.e1. doi: 10.1016/j.jacl.2016.09.006.

PMID: 27919361



Current dietary guidelines recommend the replacement of saturated fatty acids (SAFAs) with carbohydrates or monounsaturated fatty acids (MUFAs) based on evidence on the lipid profile alone; however, the chronic effects of these replacements on insulin secretion and insulin sensitivity are unclear.


To assess the chronic effects of the substitution of refined carbohydrate or MUFA for SAFA on insulin secretion and insulin sensitivity in centrally obese subjects.


In a randomized controlled crossover trial in abdominally overweight men and women, we compared the effects of substituting 7% of energy as carbohydrate or MUFA for SAFA for a period of 6 weeks each. Fasting and postprandial blood samples in response to corresponding SAFA-, carbohydrate-, or MUFA-enriched meal challenges were collected after 6 weeks on each diet treatment for the assessment of outcomes.


As expected, postprandial nonesterified fatty acid suppression and the elevation of C-peptide, insulin and glucose secretion were greatest with the high-carbohydrate (CARB) meal. Interestingly, the CARB meal attenuated postprandial insulin secretion corrected for glucose response; however, insulin sensitivity and the disposition index were not affected. SAFA and MUFA had similar effects on all markers except for fasting glucose-dependent insulinotropic peptide concentrations, which increased after MUFA but not SAFA when compared with CARB.


In conclusion, a 6-week lower-fat/higher-carbohydrate (increased by 7% refined carbohydrate) diet may have a greater adverse effect on insulin secretion corrected for glucose compared with isocaloric higher-fat diets. In contrast, exchanging MUFA for SAFA at 7% of energy had no appreciable adverse impact on insulin secretion.


Abdominal obesity; Carbohydrates; Central obesity; Gastrointestinal peptide; Gut hormone; Insulin secretion; Insulin sensitivity; Monounsaturated fatty acids; Saturated fatty acids; Type 2 diabetes


[I guess that if the population groups had been more numerous, more results would have been significant.]

Rural-to-urban migration and risk of hypertension: longitudinal results of the PERU MIGRANT study.

Bernabe-Ortiz A, Sanchez JF, Carrillo-Larco RM, Gilman RH, Poterico JA, Quispe R, Smeeth L, Miranda JJ.

J Hum Hypertens. 2016 Feb 11. doi: 10.1038/jhh.2015.124. [Epub ahead of print]

PMID: 26865219 Free PMC Article


Urbanization can be detrimental to health due to changes in dietary and physical activity patterns. The aim of this study was to determine the effect of migration on the incidence of hypertension. Participants of the PERU MIGRANT study, that is, rural, urban and rural-to-urban migrants, were re-evaluated 5 years after baseline assessment. The outcome was incidence of hypertension; the exposures were study group and other well-known risk factors. Incidence rates, relative risks (RRs) and population attributable fractions (PAFs) were calculated. At baseline, 201 (20.4%), 589 (59.5%) and 199 (20.1%) participants were rural, rural-to-urban migrant and urban subjects, respectively. Overall mean age was 47.9 (s.d.±12.0) years, and 522 (52.9%) were female. Hypertension prevalence at baseline was 16.0% (95% confidence interval (CI) 13.7–18.3), being more common in the urban group, whereas pre-hypertension was more prevalent in rural participants (P<0.001). The follow-up rate at 5 years was 94%; 895 participants were re-assessed and 33 (3.3%) deaths were recorded. Overall incidence of hypertension was 1.73 (95% CI 1.36–2.20) per 100 person-years. In the multivariable model, and compared with the urban group, the rural group had a greater risk of developing hypertension (RR 3.58; 95% CI 1.42–9.06). PAFs showed high waist circumference to be the leading risk factor for hypertension development in rural (19.1%), migrant (27.9%) and urban (45.8%) participants. Subjects from rural areas are at higher risk of developing hypertension relative to rural-to-urban migrant or urban groups. Central obesity was the leading risk factor for hypertension incidence in all three population groups.
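[The abstract reports population attributable fractions (PAFs) without stating the estimator used. Levin's formula is a common choice for a single exposure; the sketch below uses hypothetical prevalence and RR values, not numbers from the study.]

```python
# Hedged sketch of Levin's formula for a population attributable
# fraction: PAF = p*(RR - 1) / (1 + p*(RR - 1)),
# where p is the exposure prevalence and RR the relative risk.
def levin_paf(prevalence: float, rr: float) -> float:
    """Return the population attributable fraction for one exposure."""
    excess = prevalence * (rr - 1.0)
    return excess / (1.0 + excess)

# Hypothetical example: an exposure with 40% prevalence and RR = 2.0.
print(round(levin_paf(0.40, 2.0), 3))  # 0.286
```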


Determinants of day-night difference in blood pressure, a comparison with determinants of daytime and night-time blood pressure.

Musameh MD, Nelson CP, Gracey J, Tobin M, Tomaszewski M, Samani NJ.

J Hum Hypertens. 2016 Mar 17. doi: 10.1038/jhh.2016.14. [Epub ahead of print]

PMID: 26984683


Blunted day-night difference in blood pressure (BP) is an independent cardiovascular risk factor, although there is limited information on determinants of diurnal variation in BP. We investigated determinants of day-night difference in systolic (SBP) and diastolic (DBP) BP and how these compared with determinants of daytime and night-time SBP and DBP. We analysed the association of mean daytime, mean night-time and mean day-night difference (defined as (mean daytime-mean night-time)/mean daytime) in SBP and DBP with clinical, lifestyle and biochemical parameters from 1562 adult individuals (mean age 38.6) from 509 nuclear families recruited in the GRAPHIC Study. We estimated the heritability of the various BP phenotypes. In multivariate analysis, there were significant associations of age, sex, markers of adiposity (body mass index and waist-hip ratio), plasma lipids (total and low-density lipoprotein cholesterol and triglycerides), serum uric acid, alcohol intake and current smoking status with daytime or night-time SBP and/or DBP. Of these, only age (P=4.7 × 10-5), total cholesterol (P=0.002), plasma triglycerides (P=0.006) and current smoking (P=3.8 × 10-9) associated with day-night difference in SBP, and age (P=0.001), plasma triglycerides (P=2.2 × 10-5) and current smoking (P=3.8 × 10-4) associated with day-night difference in DBP. 24-h, daytime and night-time SBP and DBP showed substantial heritability (ranging from 18% to 43%). In contrast, day-night difference in SBP showed a lower heritability (13%), while heritability of day-night difference in DBP was not significant. These data suggest that specific clinical, lifestyle and biochemical factors contribute to inter-individual variation in daytime, night-time and day-night differences in SBP and DBP. Variation in day-night differences in BP is largely non-genetic.
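[The day-night difference as defined in the abstract, (mean daytime − mean night-time)/mean daytime, reduces to a one-line computation; the BP values below are hypothetical, not from the GRAPHIC Study.]

```python
# Sketch of the relative day-night BP difference ("dip") as defined
# in the abstract: (mean daytime - mean night-time) / mean daytime.
def day_night_difference(mean_day: float, mean_night: float) -> float:
    """Return the day-night difference as a fraction of the daytime mean."""
    return (mean_day - mean_night) / mean_day

# Hypothetical ambulatory SBP means (mm Hg):
dip = day_night_difference(mean_day=130.0, mean_night=115.0)
print(round(dip, 4))  # 0.1154, i.e. an ~11.5% nocturnal dip
```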

Link to comment
Share on other sites

The Oldest Old in the Emergency Department: Impact of Renal Function.

Brünnler T, Drey M, Dirrigl G, Weingart C, Rockmann F, Sieber C, Hoffmann U.

J Nutr Health Aging. 2016;20(10):1045-1050.

PMID: 27925145



The ageing population means increasing numbers of older adults attending Emergency Departments (EDs). We assessed estimated glomerular filtration rate as a predictor of clinical outcomes in oldest-old patients (≥ 85 years) attending the ED of a university teaching hospital.


Within three years, 81,831 patient contacts were made in our ED; 7,799 (9.5%) were with patients older than 85 years, in whom we analyzed the impact of renal function on various outcome parameters. Furthermore, this patient group was compared to patients < 85 years.


Within the group of patients ≥ 85 years, it was not older age but decreased glomerular filtration rate that led to significantly longer hospital stays. In addition, impaired kidney function was associated with lower heart rates, lower blood pressure, lower oxygenation, a higher rate of pre-existing ambulatory care arrangements, as well as higher mortality. Compared to younger patients, the oldest old differed significantly with regard to medical attribution (e.g. internal medicine, surgery), sex distribution, length of hospital stay, Manchester triage score, Glasgow Coma Scale, visual analogue pain scale, heart rate, blood pressure, oxygen saturation, fall prophylaxis, outpatient care, and presence of relatives.


In conclusion, in this large cohort of oldest-old patients, impaired kidney function seems to be a more important determinant of adverse outcomes, and thus of increased health care costs, than age per se. Adapted diagnostic and treatment strategies for this population in EDs are therefore warranted.


Effects of a culturally adapted lifestyle intervention on cardio-metabolic outcomes: a randomized controlled trial in Iraqi immigrants to Sweden at high risk for Type 2 diabetes.

Siddiqui F, Kurbasic A, Lindblad U, Nilsson PM, Bennet L.

Metabolism. 2017 Jan;66:1-13. doi: 10.1016/j.metabol.2016.10.001.

PMID: 27923444



Middle-Eastern immigrants constitute a growing proportion of the Swedish population and are at high risk for Type 2 diabetes. This calls for a more proactive preventive approach for dealing with diabetes risk in this target group. The aim was to test the effect of a culturally adapted lifestyle intervention programme on changes in lifestyle habits and cardio-metabolic outcomes comparing an intervention group with a control group receiving usual care.


Citizens of Malmö, Sweden, born in Iraq and at high risk for Type 2 diabetes (n=636) were invited. The participation rate was 15.1%. In all, 96 participants were randomized to the intervention group (n=50) or to the control group (n=46). The intervention group was offered seven group sessions addressing healthy diet and physical activity, including one cooking class. Changes in body weight, physical activity levels and cardio-metabolic outcomes were evaluated using linear mixed-effects models.


The mean follow-up time was 3.9 and 3.5 months in the intervention and control groups, respectively. The drop-out rate from baseline to the last visit was 30.0% in the intervention group (n=15) and 30.4% in the control group (n=14). The mean insulin sensitivity index increased significantly at follow-up in the intervention group compared to the control group (10.9% per month, p=0.005). The intervention group also reached a significant reduction in body weight (0.4% per month, p=0.004), body mass index (0.4% per month, p=0.004) and LDL-cholesterol (2.1% per month, p=0.036) compared to the control group. In total, 14.3% in the intervention group reached the goal to lose ≥5% of body weight versus none in the control group.


This culturally adapted lifestyle intervention programme shows a beneficial effect on insulin action, body weight reduction, as well as LDL-cholesterol reduction, in Middle-Eastern immigrants. The programme adapted to resources in primary health care provides tools for improved primary prevention and reduced cardio-metabolic risk in this high-risk group for Type 2 diabetes.


Body weight; Immigrants; Insulin sensitivity; Lifestyle; Randomized controlled trial


Tea Consumption Reduces the Incidence of Neurocognitive Disorders: Findings from the Singapore Longitudinal Aging Study.

Feng L, Chong MS, Lim WS, Gao Q, Nyunt MS, Lee TS, Collinson SL, Tsoi T, Kua EH, Ng TP.

J Nutr Health Aging. 2016;20(10):1002-1009.

PMID: 27925140



To examine the relationships between tea consumption habits and incident neurocognitive disorders (NCD) and explore potential effect modification by gender and the apolipoprotein E (APOE) genotype.


Population-based longitudinal study.


The Singapore Longitudinal Aging Study (SLAS).


957 community-living Chinese elderly who were cognitively intact at baseline.


We collected tea consumption information at baseline from 2003 to 2005 and ascertained incident cases of neurocognitive disorders (NCD) from 2006 to 2010. Odds ratios (ORs) of association were calculated in logistic regression models that adjusted for potential confounders.


A total of 72 incident NCD cases were identified from the cohort. Tea intake was associated with lower risk of incident NCD, independent of other risk factors. Reduced NCD risk was observed for both green tea (OR=0.43) and black/oolong tea (OR=0.53) and appeared to be influenced by changes in tea consumption habits at follow-up. Using consistent non-tea consumers as the reference, only consistent tea consumers had reduced risk of NCD (OR=0.39). Stratified analyses indicated that tea consumption was associated with reduced risk of NCD among females (OR=0.32) and APOE ε4 carriers (OR=0.14) but not among males or non-APOE ε4 carriers.


Regular tea consumption was associated with lower risk of neurocognitive disorders among Chinese elderly. Gender and genetic factors could possibly modulate this association.


[The below paper is pdf-availed.]

Effects of Calcium, Vitamin D, and Hormone Therapy on Cardiovascular Disease Risk Factors in the Women's Health Initiative: A Randomized Controlled Trial.

Schnatz PF, Jiang X, Aragaki AK, Nudy M, OʼSullivan DM, Williams M, LeBlanc ES, Martin LW, Manson JE, Shikany JM, Johnson KC, Stefanick ML, Payne ME, Cauley JA, Howard BV, Robbins J.

Obstet Gynecol. 2016 Dec 2. [Epub ahead of print]

PMID: 27926633



To analyze the treatment effect of calcium+vitamin D supplementation, hormone therapy, both, and neither on cardiovascular disease risk factors.


We conducted a prospective, randomized, double-blind, placebo-controlled trial among Women's Health Initiative (WHI) participants. The predefined primary outcome was low-density lipoprotein cholesterol (LDL-C).


Between September 1993 and October 1998, a total of 68,132 women aged 50-79 years were recruited and randomized to the WHI-Dietary Modification (n=48,835) and WHI-Hormone Therapy (n=27,347) trials. Subsequently, 36,282 women from the WHI-Hormone Therapy (n=16,089) and WHI-Dietary Modification (n=25,210) trials were randomized in the WHI-Calcium+Vitamin D trial to 1,000 mg elemental calcium carbonate plus 400 international units vitamin D3 daily or placebo. Our study group included 1,521 women who participated in both the hormone therapy and calcium+vitamin D trials and were in the 6% subsample of trial participants with blood sample collections at baseline and years 1, 3, and 6. The average treatment effect for LDL-C, compared with placebo, was -1.6 (95% confidence interval [CI] -5.5 to 2.2) mg/dL for calcium+vitamin D alone, -9.0 (95% CI -13.0 to -5.1) mg/dL for hormone therapy alone, and -13.8 (95% CI -17.8 to -9.8) mg/dL for the combination. There was no evidence of a synergistic effect of calcium+vitamin D+hormone therapy on LDL-C (P value for interaction=.26) except in those with low total intakes of vitamin D, for whom there was a significant synergistic effect on LDL-C (P value for interaction=.03).


Reductions in LDL-C were greater among women randomized to both calcium+vitamin D and hormone therapy than for those randomized to either intervention alone or to placebo. The treatment effect observed in the calcium+vitamin D+hormone therapy combination group may be additive rather than synergistic. For clinicians and patients deciding to begin calcium+vitamin D supplementation, current use of hormone therapy should not influence that decision.


Mitochondrial pyruvate carrier regulates autophagy, inflammation, and neurodegeneration in experimental models of Parkinson’s disease

Anamitra Ghosh, Trevor Tyson, Sonia George, Erin N. Hildebrandt, Jennifer A. Steiner, Zachary Madaj, Emily Schulz, Emily Machiela, William G. McDonald, Martha L. Escobar Galvis, Jeffrey H. Kordower, Jeremy M. Van Raamsdonk, Jerry R. Colca, Patrik Brundin.

Sci Transl Med. 2016 Dec 7;8(368):368ra174. doi: 10.1126/scitranslmed.aag2210.

A mitochondrial target for slowing Parkinson's disease

Currently, there are no disease-modifying treatments to stall the progression of Parkinson’s disease (PD). A drug in development to treat diabetes might provide a new way to slow the progression of PD, according to new work by Ghosh and colleagues. The drug, MSDC-0160, targets a recently identified carrier of pyruvate (a major substrate for energy production) into the mitochondria. Ghosh et al. now show that this drug, which attenuates the mitochondrial pyruvate carrier, blocks neurodegeneration in several different cellular and animal models of PD. Furthermore, cellular autophagy was restored, and neuroinflammation was reduced in two mouse models of PD. These results support continued investigations into whether the mitochondrial pyruvate carrier will be a useful therapeutic target in PD.


Mitochondrial and autophagic dysfunction as well as neuroinflammation are involved in the pathophysiology of Parkinson’s disease (PD). We hypothesized that targeting the mitochondrial pyruvate carrier (MPC), a key controller of cellular metabolism that influences mTOR (mammalian target of rapamycin) activation, might attenuate neurodegeneration of nigral dopaminergic neurons in animal models of PD. To test this, we used MSDC-0160, a compound that specifically targets MPC, to reduce its activity. MSDC-0160 protected against 1-methyl-4-phenylpyridinium (MPP+) insult in murine and cultured human midbrain dopamine neurons and in an α-synuclein–based Caenorhabditis elegans model. In 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP)–treated mice, MSDC-0160 improved locomotor behavior, increased survival of nigral dopaminergic neurons, boosted striatal dopamine levels, and reduced neuroinflammation. Long-term targeting of MPC preserved motor function, rescued the nigrostriatal pathway, and reduced neuroinflammation in the slowly progressive Engrailed1 (En1+/−) genetic mouse model of PD. Targeting MPC in multiple models resulted in modulation of mitochondrial function and mTOR signaling, with normalization of autophagy and a reduction in glial cell activation. Our work demonstrates that changes in metabolic signaling resulting from targeting MPC were neuroprotective and anti-inflammatory in several PD models, suggesting that MPC may be a useful therapeutic target in PD.


Scanning Ultrasound (SUS) Causes No Changes to Neuronal Excitability and Prevents Age-Related Reductions in Hippocampal CA1 Dendritic Structure in Wild-Type Mice.

Hatch RJ, Leinenga G, Götz J.

PLoS One. 2016 Oct 11;11(10):e0164278. doi: 10.1371/journal.pone.0164278.

PMID: 27727310 Free PMC Article


Scanning ultrasound (SUS) is a noninvasive approach that has recently been shown to ameliorate histopathological changes and restore memory functions in an Alzheimer's disease mouse model. Although no overt neuronal damage was reported, the short- and long-term effects of SUS on neuronal excitability and dendritic tree morphology had not been investigated. To address this, we performed patch-clamp recordings from hippocampal CA1 pyramidal neurons in wild-type mice 2 and 24 hours after a single SUS treatment, and one week and 3 months after six weekly SUS treatments, including sham treatments as controls. In both treatment regimes, no changes in CA1 neuronal excitability were observed in SUS-treated neurons when compared to sham-treated neurons at any time-point. For the multiple treatment groups, we also determined the dendritic morphology and spine densities of the neurons from which we had recorded. The apical trees of sham-treated neurons were reduced at the 3 month time-point when compared to one week; however, surprisingly, no longitudinal change was detected in the apical dendritic trees of SUS-treated neurons. In contrast, the length and complexity of the basal dendritic trees were not affected by SUS treatment at either time-point. The apical dendritic spine densities were reduced, independent of the treatment group, at 3 months compared to one week. Collectively, these data suggest that ultrasound can be employed to prevent an age-associated loss of dendritic structure without impairing neuronal excitability.


Eric S. Kim, Kaitlin A. Hagan, Francine Grodstein, Dawn L. DeMeo, Immaculata De Vivo, and Laura D. Kubzansky

Optimism and Cause-Specific Mortality: A Prospective Cohort Study

Am. J. Epidemiol. first published online December 7, 2016 doi:10.1093/aje/kww182

Abbreviations: CI, confidence interval; HR, hazard ratio; MET, metabolic equivalent of task.


Growing evidence has linked positive psychological attributes like optimism to a lower risk of poor health outcomes, especially cardiovascular disease. It has been demonstrated in randomized trials that optimism can be learned. If associations between optimism and broader health outcomes are established, it may lead to novel interventions that improve public health and longevity. In the present study, we evaluated the association between optimism and cause-specific mortality in women after considering the role of potential confounding (sociodemographic characteristics, depression) and intermediary (health behaviors, health conditions) variables. We used prospective data from the Nurses’ Health Study (n = 70,021). Dispositional optimism was measured in 2004; all-cause and cause-specific mortality rates were assessed from 2006 to 2012. Using Cox proportional hazard models, we found that a higher degree of optimism was associated with a lower mortality risk. After adjustment for sociodemographic confounders, compared with women in the lowest quartile of optimism, women in the highest quartile had a hazard ratio of 0.71 (95% confidence interval: 0.66, 0.76) for all-cause mortality. Adding health behaviors, health conditions, and depression attenuated but did not eliminate the associations (hazard ratio = 0.91, 95% confidence interval: 0.85, 0.97). Associations were maintained for various causes of death, including cancer, heart disease, stroke, respiratory disease, and infection. Given that optimism was associated with numerous causes of mortality, it may provide a valuable target for new research on strategies to improve health.

Key words: health psychology; optimism; psychological well-being; resilience


Rapamycin Reverses Metabolic Deficits in Lamin A/C-Deficient Mice.

Liao CY, Anderson SS, Chicoine NH, Mayfield JR, Academia EC, Wilson JA, Pongkietisak C, Thompson MA, Lagmay EP, Miller DM, Hsu YM, McCormick MA, O'Leary MN, Kennedy BK.

Cell Rep. 2016 Dec 6;17(10):2542-2552. doi: 10.1016/j.celrep.2016.10.040.

PMID: 27926859


The role of the mTOR inhibitor, rapamycin, in regulation of adiposity remains controversial. Here, we evaluate mTOR signaling in lipid metabolism in adipose tissues of Lmna-/- mice, a mouse model for dilated cardiomyopathy and muscular dystrophy. Lifespan extension by rapamycin is associated with increased body weight and fat content, two phenotypes we link to suppression of elevated energy expenditure. In both white and brown adipose tissue of Lmna-/- mice, we find that rapamycin inhibits mTORC1 but not mTORC2, leading to suppression of elevated lipolysis and restoration of thermogenic protein UCP1 levels, respectively. The short lifespan and metabolic phenotypes of Lmna-/- mice can be partially rescued by maintaining mice at thermoneutrality. Together, our findings indicate that altered mTOR signaling in Lmna-/- mice leads to a lipodystrophic phenotype that can be rescued with rapamycin, highlighting the effect of loss of adipose tissue in Lmna-/- mice and the consequences of altered mTOR signaling.


Lmna(−/−) mice; adiposity; lifespan; lipodystrophy; mTOR; progeria; rapamycin; thermoneutrality


Am J Forensic Med Pathol. 2012 Dec;33(4):362-7. doi: 10.1097/PAF.0b013e31823d298b.

Normal organ weights in men: part I-the heart.

Molina DK, DiMaio VJ.

PMID: 22182983 


It has been shown that cardiac enlargement, whether hypertrophic or dilated, is an independent risk factor for sudden cardiac death, although the definition of what constitutes cardiac enlargement is not universally established. This study was designed to address this issue and to determine normal cardiac weights in adult men. A prospective study was undertaken of healthy men aged 18 to 35 years who died sudden traumatic deaths. Cases were excluded if there was a history of medical illness including illicit drug use; prolonged medical treatment was performed; there was a prolonged period between the time of injury and death; body length and weight could not be accurately assessed; there was significant cardiac injury; or any illness or intoxication was identified after gross and microscopic analysis, including evidence of systemic disease. A total of 232 cases met the criteria for inclusion in the study during the approximately 6-year period of data collection from 2005 to 2011. The decedents had an average age of 23.9 years and ranged in length from 146 to 193 cm with an average length of 173 cm. Their weights ranged from 48.5 to 153 kg with an average weight of 76.4 kg. Most decedents (87%) died of either ballistic or blunt force (including craniocerebral) injuries. Overall, their heart weights ranged from 188 to 575 g with an average of 331 g and an SD of 56.7 g. Regression analysis was performed to assess the relationship between heart weight and body weight, body length, and body mass index; the associations were insufficient to enable predictability. The authors, therefore, propose establishing a reference range for heart weight in men, much like those in use for other laboratory tests including hemoglobin, hematocrit, or glucose. A reference range (95% inclusion) of 233 to 383 g for the adult male human heart is proposed.
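[The paper does not state how its "95% inclusion" reference range was derived. A common empirical approach is the 2.5th–97.5th percentile interval of the observed values; the sketch below uses synthetic data generated only to roughly match the reported mean and SD, not the study's 232 measurements.]

```python
# Hedged sketch: an empirical 95% inclusion range via the 2.5th and
# 97.5th percentiles. Data are synthetic (random normal draws with
# mean 331 g and SD 56.7 g), NOT the study's heart-weight data.
import random
import statistics

random.seed(0)
weights = [random.gauss(331, 56.7) for _ in range(232)]

# statistics.quantiles with n=40 returns 39 cut points at 2.5% steps,
# so the first and last are the 2.5th and 97.5th percentiles.
q = statistics.quantiles(weights, n=40)
lower, upper = q[0], q[-1]
print(f"95% reference range: {lower:.0f}-{upper:.0f} g")
```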


Normal organ weights in men: part II-the brain, lungs, liver, spleen, and kidneys.

Molina DK, DiMaio VJ.

Am J Forensic Med Pathol. 2012 Dec;33(4):368-72. doi: 10.1097/PAF.0b013e31823d29ad.

PMID: 22182984


Organomegaly can be a sign of disease and pathologic abnormality, although standard tables defining organomegaly have yet to be established and universally accepted. This study was designed to address the issue and to determine a normal weight for the major organs in adult human males. A prospective study of healthy men aged 18 to 35 years who died of sudden, traumatic deaths was undertaken. Cases were excluded if there was a history of medical illness including illicit drug use, if prolonged medical treatment was performed, if there was a prolonged period between the time of injury and death, if body length and weight could not be accurately assessed, or if any illness or intoxication was identified after gross and microscopic analysis including evidence of systemic disease. Individual organs were excluded if there was significant injury to the organ, which could have affected the weight. A total of 232 cases met criteria for inclusion in the study during the approximately 6-year period of data collection from 2005 to 2011. The decedents had a mean age of 23.9 years and ranged in length from 146 to 193 cm, with a mean length of 173 cm. The weight ranged from 48.5 to 153 kg, with a mean weight of 76.4 kg. Most decedents (87%) died of either ballistic or blunt force (including craniocerebral) injuries. The mean weight of the brain was 1407 g (range, 1070-1767 g), that of the liver was 1561 g (range, 838-2584 g), that of the spleen was 139 g (range, 43-344 g), that of the right lung was 445 g (range, 185-967 g), that of the left lung was 395 g (range, 186-885 g), that of the right kidney was 129 g (range, 79-223 g), and that of the left kidney was 137 g (range, 74-235 g). Regression analysis was performed and showed that there were insufficient associations between organ weight and body length, body weight, and body mass index to allow for predictability. 
The authors, therefore, propose establishing a reference range for organ weights in men, much like those in use for other laboratory tests including hemoglobin, hematocrit, or glucose. The following reference ranges (95% inclusion) are proposed: brain, 1179-1621 g; liver, 968-1860 g; spleen, 28-226 g; right lung, 155-720 g; left lung, 112-675 g; right kidney, 81-160 g; and left kidney, 83-176 g.

Edited by AlPater
Link to comment
Share on other sites

CD4:8 Ratio Above 5 Is Associated With All-Cause Mortality in CMV-Seronegative Very Old Women: Results From the BELFRAIL Study.

Adriaensen W, Pawelec G, Vaes B, Hamprecht K, Derhovanessian E, van Pottelbergh G, Degryse JM, Matheï C.

J Gerontol A Biol Sci Med Sci. 2016 Dec 7. pii: glw215. [Epub ahead of print]

PMID: 27927759


The occurrence and general applicability of the CD4:8 ratio as a surrogate predictor of mortality among the oldest old have only been tested in a few longitudinal studies. Here, the predictive value of the CD4:8 ratio for mortality with respect to the role of cytomegalovirus (CMV) infection was investigated. Using polychromatic flow cytometry, the CD4:8 ratio and T-cell subsets of 235 individuals aged 81.5 years or older were analyzed, and mortality data were collected after a mean period of 3.3 years. Among women, the hazard for all-cause mortality adjusted for age, comorbidity, and CMV serostatus increased 1.53-fold (95% CI: 0.94-2.51) with each increment in CD4:8 ratio category, from R < 1 to 1 < R < 5 to R > 5. A hazard ratio of 0.50 for CMV seropositivity in women indicated an apparently protective effect of this virus. In men, no associations with survival were observed. No mediation effect could be found for the CD4:8 ratio with respect to the relationship between CMV serostatus and mortality. Very elderly CMV-negative women with R > 5 experienced the highest mortality rates, independent of age and comorbidity. The associations of CMV serostatus and CD4:8 ratio with mortality seem to reflect distinct pathways mediating life span in very old humans.


Cytomegalovirus; Mortality; Oldest old; Sex differences; T-cell distribution


[The below paper is pdf-availed.]

"Still feeling healthy after all these years": The paradox of subjective stability versus objective decline in very old adults' health and functioning across five years.

Wettstein M, Schilling OK, Wahl HW.

Psychol Aging. 2016 Dec;31(8):815-830. doi: 10.1037/pag0000137.

PMID: 27929338


Indicators of objective functioning, such as everyday competence or sensory and sensorimotor functions, typically show pronounced declines in very old age. However, less is known about how very old adults perceive their abilities across multiple domains of health and functioning and to what extent changes in perceived functioning mirror changes in objective functioning. We compared changes in perceived versus objective health and functioning indicators among very old adults (n = 124; baseline age between 87 and 97 years, M = 90.56 years, SD = 2.92 years) across 11 measurement occasions, spanning approximately 5 years. Functioning was assessed by self-reports (subjective health, subjective movement ability, subjective vision, and number of perceived symptoms) and by objective and mostly performance-based tests (everyday competence, visual acuity, chair stand test, and grip strength). All objective measures exhibited a significant mean-level decline across 5 years, whereas most subjective indicators did not reveal significant mean-level changes. Interindividual variation in intraindividual change patterns was considerable in most domains. Correlations between trajectories of the different indicators were mostly weak, and predicting late-life changes in subjective functioning by time-varying objective functioning indicators accounted for only modest amounts of variance. Our findings suggest that there is a somewhat paradoxical pattern of discrepant late-life change trends in subjective versus objective indicators of health and functioning. We argue that a differentiated understanding of the fourth age is required and that common health definitions frequently applied to old and very old age need to be challenged.


Daily milk consumption and all-cause mortality, coronary heart disease and stroke: a systematic review and meta-analysis of observational cohort studies.

Mullie P, Pizot C, Autier P.

BMC Public Health. 2016 Dec 8;16(1):1236.

PMID: 27927192



Observational studies and meta-analyses relating milk consumption by adults to all-cause mortality, coronary heart disease and stroke have obtained contradictory results. Some studies found a protective effect of milk consumption, whilst others found an increased risk.


We performed a systematic literature search up to June 2015 for prospective studies of milk consumption, all-cause mortality, coronary heart disease and stroke. Random-effects dose-response meta-analyses were performed.


Twenty-one studies involving 19 cohorts were included in this meta-analysis, 11 on all-cause mortality, 9 on coronary heart disease, and 10 on stroke. Milk intake ranged from 0 to 850 mL/d. The summary relative risk (SRR) for 200 mL/d milk consumption was 1.01 (95% CI: 0.96-1.06) for all-cause mortality, 1.01 (95% CI: 0.98-1.05) for fatal and non-fatal coronary heart disease, and 0.91 (95% CI: 0.82-1.02) for fatal and non-fatal stroke. Stratified analyses by age, body mass index, total energy intake and physical activity did not alter the SRR estimates. The possibility of publication bias was found for all-cause mortality and for stroke, indicating a gap in data that could have suggested a higher risk of these conditions with increased milk consumption.


We found no evidence for a decreased or increased risk of all-cause mortality, coronary heart disease, and stroke associated with adult milk consumption. However, the possibility cannot be dismissed that risks associated with milk consumption could be underestimated because of publication bias.


Dietary magnesium intake and the risk of cardiovascular disease, type 2 diabetes, and all-cause mortality: a dose-response meta-analysis of prospective cohort studies.

Fang X, Wang K, Han D, He X, Wei J, Zhao L, Imam MU, Ping Z, Li Y, Xu Y, Min J, Wang F.

BMC Med. 2016 Dec 8;14(1):210.

PMID: 27927203



Although studies have examined the association between dietary magnesium intake and health outcome, the results are inconclusive. Here, we conducted a dose-response meta-analysis of prospective cohort studies in order to investigate the correlation between magnesium intake and the risk of cardiovascular disease (CVD), type 2 diabetes (T2D), and all-cause mortality.


PubMed, EMBASE, and Web of Science were searched for articles that contained risk estimates for the outcomes of interest and were published through May 31, 2016. The pooled results were analyzed using a random-effects model.


Forty prospective cohort studies totaling more than 1 million participants were included in the analysis. During the follow-up periods (ranging from 4 to 30 years), 7678 cases of CVD, 6845 cases of coronary heart disease (CHD), 701 cases of heart failure, 14,755 cases of stroke, 26,299 cases of T2D, and 10,983 deaths were reported. No significant association was observed between increasing dietary magnesium intake (per 100 mg/day increment) and the risk of total CVD (RR: 0.99; 95% CI, 0.88-1.10) or CHD (RR: 0.92; 95% CI, 0.85-1.01). However, the same incremental increase in magnesium intake was associated with a 22% reduction in the risk of heart failure (RR: 0.78; 95% CI, 0.69-0.89) and a 7% reduction in the risk of stroke (RR: 0.93; 95% CI, 0.89-0.97). Moreover, the summary relative risks of T2D and mortality per 100 mg/day increment in magnesium intake were 0.81 (95% CI, 0.77-0.86) and 0.90 (95% CI, 0.81-0.99), respectively.


Increasing dietary magnesium intake is associated with a reduced risk of stroke, heart failure, diabetes, and all-cause mortality, but not CHD or total CVD. These findings support the notion that increasing dietary magnesium might provide health benefits.


All-cause mortality; Cardiovascular disease; Magnesium; Meta-analysis; Type 2 diabetes
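The pooled relative risks in the magnesium meta-analysis above come from a random-effects model. A minimal sketch of the standard DerSimonian-Laird estimator, fed with hypothetical per-100 mg/day log-RRs and standard errors (not the study's data):

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool log relative risks with a DerSimonian-Laird random-effects model."""
    w = [1 / s**2 for s in se]                                  # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                               # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]                    # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Hypothetical per-100 mg/day log-RRs and SEs from three imaginary cohorts.
rr, ci = dersimonian_laird([-0.05, -0.10, -0.02], [0.03, 0.04, 0.05])
```

When heterogeneity is low (Q below its degrees of freedom), tau-squared collapses to zero and the estimate equals the fixed-effect result.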


Clinical and Metabolic Response to Selenium Supplementation in Pregnant Women at Risk for Intrauterine Growth Restriction: Randomized, Double-Blind, Placebo-Controlled Trial.

Mesdaghinia E, Rahavi A, Bahmani F, Sharifi N, Asemi Z.

Biol Trace Elem Res. 2016 Dec 7. [Epub ahead of print]

PMID: 27928721


Data on the effects of selenium supplementation on clinical signs and metabolic profiles in women at risk for intrauterine growth restriction (IUGR) are scarce. This study was designed to assess the effects of selenium supplementation on clinical signs and metabolic status in pregnant women at risk for IUGR. This randomized double-blind placebo-controlled clinical trial was performed among 60 women at risk for IUGR according to abnormal uterine artery Doppler waveform. Participants were randomly assigned to take either 100 μg selenium supplements as a tablet (n = 30) or placebo (n = 30) for 10 weeks between 17 and 27 weeks of gestation. After 10 weeks of selenium administration, a higher percentage of women in the selenium group had a pulsatility index (PI) of <1.45 (P = 0.002) than those in the placebo group. In addition, changes in plasma levels of total antioxidant capacity (TAC) (P < 0.001), glutathione (GSH) (P = 0.008), and high-sensitivity C-reactive protein (hs-CRP) (P = 0.004) in the selenium group were significant compared with the placebo group. Additionally, selenium supplementation significantly decreased serum insulin (P = 0.02), homeostasis model of assessment-estimated insulin resistance (HOMA-IR) (P = 0.02), and homeostatic model assessment for B-cell function (HOMA-B) (P = 0.02) and significantly increased quantitative insulin sensitivity check index (QUICKI) (P = 0.04) and HDL-C levels (P = 0.02) compared with the placebo. We did not find any significant effect of selenium administration on malondialdehyde (MDA), nitric oxide (NO), fasting plasma glucose (FPG), and other lipid profiles. Overall, selenium supplementation in pregnant women at risk for IUGR resulted in improved PI, TAC, GSH, hs-CRP, and markers of insulin metabolism and HDL-C levels, but it did not affect MDA, NO, FPG, and other lipid profiles.


Antioxidant; Intrauterine growth restriction; Metabolic profiles; Oxidative stress; Pregnant women; Selenium
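The insulin-metabolism markers reported in the selenium abstract above (HOMA-IR, HOMA-B, QUICKI) are standard surrogate indices computed from fasting glucose and insulin alone. A quick sketch using the conventional formulas, with illustrative (not study) values:

```python
import math

def insulin_indices(glucose_mgdl, insulin_uU_ml):
    """HOMA-IR, HOMA-B, and QUICKI from fasting glucose (mg/dL) and insulin (uU/mL)."""
    homa_ir = glucose_mgdl * insulin_uU_ml / 405.0                 # insulin resistance
    homa_b = 360.0 * insulin_uU_ml / (glucose_mgdl - 63.0)         # % beta-cell function
    quicki = 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mgdl))
    return homa_ir, homa_b, quicki

# Illustrative fasting values: glucose 90 mg/dL, insulin 10 uU/mL.
homa_ir, homa_b, quicki = insulin_indices(90.0, 10.0)
```

Higher HOMA-IR and lower QUICKI both indicate worse insulin sensitivity, which is why the trial reports the two moving in opposite directions.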


Dietary protein and fat intake in relation to risk of colorectal adenoma in Korean.

Yang SY, Kim YS, Lee JE, Seol J, Song JH, Chung GE, Yim JY, Lim SH, Kim JS.

Medicine (Baltimore). 2016 Dec;95(49):e5453.

PMID: 27930524


Red meat and alcohol consumption are known risk factors for colorectal cancer, but associations for dietary fat remain unclear. We investigated the associations of dietary fat, protein, and energy intake with prevalence of colorectal adenoma. We performed a cross-sectional study on asymptomatic persons who underwent a screening colonoscopy at a single center during a routine health check-up from May to December 2011. Dietary data were obtained via a validated Food Frequency Questionnaire (FFQ), assisted by a registered dietician. We also obtained information on alcohol consumption and smoking status, and measured metabolic syndrome markers including abdominal circumference, blood pressure, fasting glucose, serum triglyceride and high-density lipoprotein cholesterol. We calculated odds ratios (ORs) and 95% confidence intervals (CIs) to evaluate the associations using polytomous logistic regression models. As a secondary analysis, we also conducted a matched analysis, matched by age and sex (557 cases and 557 non-cases). The study sample included 557 cases (406 males and 151 females) with histopathologically confirmed colorectal adenoma, and 1157 controls (650 males and 507 females). The proportion of advanced adenoma was 28.1% in men and 18.5% in women, respectively. Although vegetable protein intake was inversely associated with the prevalence of colorectal adenoma, further adjustment for potential confounding factors attenuated the association, resulting in no significant associations. There were no significant associations between dietary fat intake and colorectal adenoma in energy-adjusted models. For vegetable protein in women, the OR for the comparison of those in the highest tertile with those in the lowest tertile was 0.47 (95% CI 0.25-0.91, P for trend = 0.07) after adjustment for total energy intake. However, after controlling for metabolic syndrome markers, body mass index, smoking status, alcohol consumption, and family history of colorectal adenoma, which were all significantly higher in the colorectal adenoma group, the association became attenuated (OR 0.54, 95% CI 0.27-1.11, P for trend = 0.13). In conclusion, we did not observe significant associations for intakes of total energy, total, animal and vegetable fats, and total, animal and vegetable proteins in relation to colorectal adenoma prevalence.


Diet as a system: an observational study investigating a multi-choice system of moderately restricted low-protein diets.

Piccoli GB, Nazha M, Capizzi I, Vigotti FN, Scognamiglio S, Consiglio V, Mongilardi E, Bilocati M, Avagnina P, Versino E.

BMC Nephrol. 2016 Dec 7;17(1):197.

PMID: 27927186



There is no single, gold-standard, low-protein diet (LPD) for CKD patients; the best compliance is probably obtained by personalization. This study tests the hypothesis that a multiple choice diet network allows patients to attain a good compliance level, and that, in an open-choice system, overall results are not dependent upon the specific diet, but upon the clinical characteristics of the patients.


Observational study: Three LPD options were offered to all patients with severe or rapidly progressive CKD: vegan diets supplemented with alpha-ketoacids and essential amino acids; protein-free food substituted for normal bread and pasta; other (traditional, non-supplemented vegan, and tailored). Dialysis-free follow-up and survival were analyzed by Kaplan-Meier curves according to diet, comorbidity and age. Compliance and metabolic control were estimated in 147 subjects on a diet as of March 2015 with recent complete data and a prescribed protein intake of 0.6 g/kg/day. Protein intake was assessed by the Maroni-Mitch formula.


Four hundred and forty-nine patients followed an LPD between December 2007 and March 2015 (90% moderately restricted LPDs at 0.6 g/kg/day of protein, 10% at lower targets); age (median 70, range 19-97) and comorbidity (Charlson index: 7) characterized our population as being in line with the usual European CKD population. Median e-GFR at the start of the diet was 20 mL/min, and 33.2% of the patients were diabetic. Baseline data differed significantly across diets: protein-free schemas were preferred by older, high-comorbidity patients (median age 76 years, Charlson index 8, GFR 20.5 mL/min, proteinuria 0.3 g/day); supplemented vegan diets by younger patients with lower GFR and higher proteinuria (median age 65 years, Charlson index 6, GFR 18.9 mL/min, proteinuria 1.2 g/day); other diets were chosen by an intermediate population (median age 71 years, Charlson index 6, GFR 22.5 mL/min, proteinuria 0.9 g/day) (p < 0.001 for age, Charlson index, proteinuria, GFR). Adherence was good: only 1.1% of the patients were lost to follow-up, and protein intake was at target in most cases with no differences among LPDs (protein intake: 0.47 (0.26-0.86) g/kg/day). After adjustment for confounders and/or selection of similar populations, no difference in mortality or dialysis start was observed on the different LPDs. Below the threshold of e-GFR 15 mL/min, 50% of the patients remained dialysis-free for at least two years.


A multiple choice LPD system may allow reaching good adherence, without competition among diets, and with promising results in terms of dialysis-free follow-up. The advantages with respect to a non-customized approach deserve confirmation in further comparative studies or RCTs.


CKD; Dialysis; Low-protein Diet; Mortality
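Protein intake "assessed by the Maroni-Mitch formula" in the LPD study above is estimated from 24-h urinary urea nitrogen (UUN), on the assumption that non-urea nitrogen losses are roughly 0.031 g N/kg/day. A minimal sketch with a hypothetical patient:

```python
def maroni_protein_intake(uun_g_day, weight_kg):
    """Estimate protein intake (g/kg/day) from 24-h urinary urea nitrogen via the
    Maroni-Mitch formula: intake (g/day) = 6.25 * (UUN + 0.031 * body weight)."""
    grams_per_day = 6.25 * (uun_g_day + 0.031 * weight_kg)
    return grams_per_day / weight_kg

# Hypothetical 70 kg patient excreting 4.0 g urea nitrogen per day.
intake = maroni_protein_intake(4.0, 70.0)
```

For this hypothetical patient the estimate lands near the 0.6 g/kg/day prescription discussed in the abstract, which is how such formulas are used to verify dietary compliance.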


The prebiotic concept and human health: a changing landscape with riboflavin as a novel prebiotic candidate?

Steinert RE, Sadaghian Sadabad M, Harmsen HJ, Weber P.

Eur J Clin Nutr. 2016 Dec;70(12):1348-1353. doi: 10.1038/ejcn.2016.119. Review.

PMID: 27380884


Emerging evidence suggests that the gut microbiota has a critical role in both the maintenance of human health and the pathogenesis of many diseases. Modifying the colonic microbiota using functional foods has attracted significant research effort and product development. The pioneering concept of prebiotics, as introduced by Gibson and Roberfroid in the 1990s, emphasized the importance of diet in the modulation of the gut microbiota and its relationships to human health. Increasing knowledge of the intestinal microbiota now suggests a more comprehensive definition. This paper briefly reviews the basics of the prebiotic concept with a discussion of recent attempts to refine the concept to open the door for novel prebiotic food ingredients, such as polyphenols, minerals and vitamins.


Effect of magnesium supplementation on glucose metabolism in people with or at risk of diabetes: a systematic review and meta-analysis of double-blind randomized controlled trials.

Veronese N, Watutantrige-Fernando S, Luchini C, Solmi M, Sartore G, Sergi G, Manzato E, Barbagallo M, Maggi S, Stubbs B.

Eur J Clin Nutr. 2016 Dec;70(12):1354-1359. doi: 10.1038/ejcn.2016.154. Review.

PMID: 27530471


Although higher dietary intakes of magnesium (Mg) seem to correspond to lower diabetes incidence, research concerning Mg supplementation in people with or at risk of diabetes is limited. Thus, we aimed to investigate the effect of oral Mg supplementation on glucose and insulin-sensitivity parameters in participants with diabetes or at high risk of diabetes compared with placebo. A literature search in PubMed, EMBASE, SCOPUS, the Cochrane Central Register of Controlled Trials and Clinicaltrials.gov was undertaken without language restriction. Eligible studies were randomized controlled trials (RCTs) investigating the effect of oral Mg supplementation vs placebo in patients with diabetes or at high risk of diabetes. Standardized mean differences (SMD) and 95% confidence intervals (CIs) were used for summarizing outcomes with at least two studies; other outcomes were summarized descriptively. Eighteen RCTs (12 in people with diabetes and 6 in people at high risk of diabetes) were included. Compared with placebo (n=334), Mg treatment (n=336) reduced fasting plasma glucose (studies=9; SMD=-0.40; 95% CI: -0.80 to -0.00; I2=77%) in people with diabetes. In people at high risk of diabetes (Mg: n=226; placebo: n=227), Mg supplementation significantly improved plasma glucose levels after a 2 h oral glucose tolerance test (three studies; SMD=-0.35; 95% CI: -0.62 to -0.07; I2=0%) and demonstrated trend-level reductions in HOMA-IR (homeostatic model assessment-insulin resistance; five studies; SMD=-0.57; 95% CI: -1.17 to 0.03; I2=88%). Mg supplementation appears to have a beneficial role and improves glucose parameters in people with diabetes and also improves insulin-sensitivity parameters in those at high risk of diabetes.
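The SMD values pooled above are standardized mean differences: for a single trial the usual form is Cohen's d, the between-group difference in means divided by the pooled standard deviation. A sketch with hypothetical fasting-glucose data (not the trial values):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical trial: Mg arm mean glucose 95 mg/dL (SD 10, n=30),
# placebo arm 100 mg/dL (SD 12, n=30).
d = cohens_d(95.0, 10.0, 30, 100.0, 12.0, 30)
```

A negative d here, as in the abstract's SMDs, means the treatment arm had the lower glucose; values around 0.2, 0.5 and 0.8 are conventionally read as small, medium and large effects.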



Associations of dietary intakes of anthocyanins and berry fruits with risk of type 2 diabetes mellitus: a systematic review and meta-analysis of prospective cohort studies.

Guo X, Yang B, Tan J, Jiang J, Li D.

Eur J Clin Nutr. 2016 Dec;70(12):1360-1367. doi: 10.1038/ejcn.2016.142. Review.

PMID: 27530472


To investigate the associations of dietary intakes of anthocyanins and berry fruits with type 2 diabetes mellitus (T2DM) risk and to evaluate the potential dose-response relationships based on prospective cohort studies. Cochrane library, Embase and PubMed databases were systematically searched up to Jan 2016 for relevant original studies. Summary relative risks (RRs) were calculated with a random effects model comparing the highest with lowest category. Dose-response was estimated using restricted cubic spline regression models. Three cohort studies reporting dietary anthocyanin intake with 200 894 participants and 12 611 T2DM incident cases, and five cohort studies reporting berry intake with 194 019 participants and 13 013 T2DM incident cases were investigated. Dietary anthocyanin consumption was associated with a 15% reduction of T2DM risk (summary RR=0.85; 95% confidence interval (CI): 0.80-0.91; I2=14.5%). Consumption of berries was associated with an 18% reduction of T2DM risk (summary RR=0.82, 95% CI: 0.76-0.89; I2=48.6%). Significant curvilinear associations were found between dietary intake of anthocyanins (P for nonlinearity=0.006) and berries (P for nonlinearity=0.028) and T2DM risk, respectively. The risk of T2DM was decreased by 5%, with a 7.5 mg/day increment of dietary anthocyanin intake (RR=0.95; 95% CI: 0.93-0.98; I2=0.00%) or with a 17 g/day increment of berry intake (RR=0.95, 95% CI: 0.91-0.99; I2=0.00%), respectively. Higher dietary intakes of anthocyanins and berry fruits are associated with a lower T2DM risk.
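Per-increment relative risks like the "0.95 per 7.5 mg/day" above rescale multiplicatively on the log scale under the log-linearity such dose-response models assume (the cubic-spline fits relax this, but the quoted increments do not). A small sketch of that conversion:

```python
import math

def rescale_rr(rr, from_increment, to_increment):
    """Rescale a dose-response relative risk from one dose increment to another,
    assuming log-linearity (the RR multiplies on the log scale)."""
    return math.exp(math.log(rr) * to_increment / from_increment)

# A RR of 0.95 per 7.5 mg/day of anthocyanins, re-expressed per 15 mg/day.
rr_15 = rescale_rr(0.95, 7.5, 15.0)
```

Doubling the increment simply squares the RR (0.95² ≈ 0.90), which is why per-increment risks must always be read together with their dose unit.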


Associations between company at dinner and daily diet quality in Dutch men and women from the NQplus study.

van Lee L, Geelen A, Hooft van Huysduynen EJ, de Vries JH, van 't Veer P, Feskens EJ.

Eur J Clin Nutr. 2016 Dec;70(12):1368-1373. doi: 10.1038/ejcn.2016.123.

PMID: 27406161



Consuming the evening meal in the company of others has been associated with overall diet quality. Nevertheless, studies on the association between type of company at dinner and diet quality in adults are scarce.


Dutch men (n=895) and women (n=845) aged between 20 and 70 years, included in a population-based observational study, were studied. Dietary intake was assessed by multiple 24-h recalls (6013 recalls) to estimate the Dutch Healthy Diet index (0-80 points) representing daily diet quality. Sex-specific linear mixed models adjusting for covariates were calculated. Out-of-home dinners and company at dinner were strongly associated (r=0.66), and hence in additional analyses, out-of-home dinners were excluded to avoid multicollinearity.


Among men, daily diet quality was similar when dinners were consumed in company or consumed alone, but higher when dinner was accompanied by family (mean 46.0, s.e. 0.3) than when dinner was accompanied by others (mean 42.3, s.e. 0.7; P=0.001). Adjustment for dinner location attenuated this association, but it remained significant when excluding out-of-home dinners. Among women, daily diet quality was lower when dinner was consumed in company (mean 48.9, s.e. 0.3) than when consumed alone (mean 51.1, s.e. 0.6; P<0.001). Dinners consumed in the company of family were associated with higher daily diet quality (mean 49.3, s.e. 0.4) than dinners consumed with others (mean 45.7, s.e. 0.6; P=0.001). These associations persisted when excluding out-of-home dinners.


Only among women were dinners consumed alone associated with higher diet quality compared with dinners in company. In both men and women, dinners consumed with family were associated with higher diet quality as compared with dinners with others.


Supersize the label: The effect of prominent calorie-labelling on sales.

Charoula K. Nikolaou, Michael McPartland, Livia Demkova, Michael EJ. Lean


    •Obesity prevention is the next rational step in tackling the obesity epidemic

    •Prominent calorie-labelling could help customers choose lower-calorie food items

    •Calorie-labelling seems to be effective when noticed

    •Calorie-labelling should be part of anti-obesity strategies, along with guidelines on its formatting


Calorie-labelling has been suggested as an anti-obesity measure, but evidence on its impact is scarce and formatting guidance is not well defined. This study tested the impact of prominent calorie-labelling on sales of the labelled items. Prominent calorie labels were posted in front of two popular items for a period of a month. Sales were recorded for two consecutive months, prior to and during labelling. Muffin sales (the higher-calorie item) fell by 30% while sales of scones rose by 4%, a significant difference (X2 = 10.258, p=0.0014). Calorie-labelling is effective when noticed. Wider adoption of calorie-labelling for all food businesses and strengthening legislation with formatting guidelines should be the next steps in public health policy.
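The X2 statistic quoted above is a Pearson chi-square on a 2x2 table of sales (item x before/during labelling). The paper's underlying counts are not reproduced here, so the sketch below uses invented counts purely to show the computation; the published statistic of 10.258 comes from the actual sales data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction:
    N * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented monthly counts: muffins before/during labelling, scones before/during.
x2 = chi_square_2x2(200, 140, 100, 104)
```

Any chi-square above 3.84 is significant at p < 0.05 on 1 degree of freedom, so even these invented counts would count as a detectable shift in the sales mix.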


Anti-Aging Benefits of Walnuts

Posted on Dec. 9, 2016, 6 a.m. in Functional Foods Anti-Aging Cardio-Vascular Nutrition

Recent study explores the benefits of walnuts with regard to age-related health issues.


Previous studies have shown that walnuts may slow cancer and improve memory, concentration, and information processing speed, as well as lowering diabetes risk and boosting male fertility. Now, initial findings from the Walnuts and Healthy Aging (WAHA) study presented at Experimental Biology 2016 (EB) show that daily walnut consumption positively impacts blood cholesterol levels without adverse effects on body weight among older adults. The WAHA study is a dual-site, 2-year clinical trial conducted by researchers from the Hospital Clinic of Barcelona and Loma Linda University, focused on determining the effects of walnuts on age-related health issues.

The 707 older adults who participated in the study consumed either a daily dose of walnuts (comprising less than 15% of daily caloric intake) or a diet free of all nuts. No guidance was given on suggested total calorie and macronutrient intake or on food substitutions for walnuts. After one year, there was no evident effect on body weight, triglycerides, or HDL cholesterol in either group. However, those who ate walnuts showed significant reductions in LDL cholesterol in comparison with the nut-free diet.

Additional new research abstracts presented at the EB meeting included information regarding walnuts and the following:

Gut health - A recent study conducted by researchers at the Agricultural Research Service, United States Department of Agriculture (USDA) found that eating 1.5 ounces of walnuts daily significantly affects the bacteria in the human gut, in a manner that is favorable to decreasing inflammation and cholesterol, which are two known indicators of heart health.

Hunger and Satiety - For the first time, it has been shown that the types of fat consumed on a daily basis can change long-term appetite responses, such as hunger and satiety. Researchers from the University of Georgia found that eating a diet high in polyunsaturated fat, after meals rich in saturated fat, favorably alters hunger and satiety markers. At 13 grams per ounce, walnuts are a great choice for getting more polyunsaturated fat.

Metabolic Health - A recent study conducted at Oregon State University found that a diet of walnuts supplemented with polyphenol-rich foods such as raspberries, cherries, or green tea may help reduce inflammation. The study was conducted in mice given walnuts alone or along with polyphenol-rich foods; because it was performed in animals, the results cannot yet be considered applicable to humans.

Definitive scientific conclusions cannot yet be drawn from the abstracts presented at EB 2016; however, these findings help advance knowledge of the possible benefits of consuming walnuts as part of a healthy diet.



1. Ros E, Rajaram S, Sala-Vila A, et al. 

Effect of a 1-Year Walnut Supplementation on Blood Lipids among Older Individuals: Findings from the Walnuts and Healthy Aging (WAHA) study.

FASEB J. 2016;30(Suppl 1):293.4.


Background In small well-controlled short-term feeding studies conducted in middle-aged adults, nut-enriched diets have resulted in cholesterol lowering and absence of weight gain compared to control diets. There are no long-term data on lipid changes when supplementing the diet with nuts in older adults.

Objective To assess whether adding walnuts to the daily diet of free-living healthy older adults for 1 year will result in changes in blood lipids compared to the usual diet without nuts.

Design Within the context of the WAHA study, a two-center clinical trial aimed at determining the effect of a 2-year walnut diet on age-related outcomes, we randomized 707 free-living older but healthy individuals (352 in Barcelona, 355 in Loma Linda; 67% women, mean age 69 y, mean BMI 27.3 kg/m2) to supplementation of the usual diet with daily doses of walnuts at ≈15% of energy or to usual diet without nuts (control). Participants had frequent (once every two months) assessments by dietitians; no advice was given on total energy/macronutrient intake or food substitution for walnuts. At baseline and 1 year serum lipids were analyzed. Between-group differences in outcomes were analyzed by analysis of covariance with adjustment for sex, age, center and baseline levels; lipid values were additionally adjusted by changes in statin treatment.

Results Complete 1-year data were available for 514 participants (260 walnut diet, 254 control diet). There were 137 participants pending 1-year assessment and 56 dropouts for various reasons. The walnut diet was well tolerated and the proportion of α-linolenic acid in red blood cells increased in the walnut group by 0.162% (95% CI, 0.143–0.181) and in the control group by 0.015% (CI, −0.005–0.035) (P<0.001), indicating good compliance with the intervention. Changes in blood lipids were (mean±SEM) −7.5±1.6 versus −0.4±1.6 mg/dL (P=0.003) for total cholesterol, −7.1±1.3 versus −1.1±1.4 mg/dL (P=0.002) for LDL-cholesterol, and −0.15±0.31 versus −0.05±0.31 (P=0.025) for the total cholesterol/HDL-cholesterol ratio, in the walnut versus control diets, respectively. No between-diet differences were observed for triglycerides or HDL-cholesterol. Lipid effects were similar in the two centers. No differences in body weight were observed between treatments.

Conclusion Incorporating daily doses of walnuts to the habitual diet of older free-living individuals for 1 year was well tolerated and resulted in significant LDL-cholesterol reduction without adverse effects on body weight. Results were similar with a Mediterranean or a Western background diet.

Support or Funding Information

The WAHA study is funded by the California Walnut Commission.


2. Guetterman HM, Swanson KS, Novotny JA, et al. 

Walnut Consumption Influences the Human Gut Microbiome.

FASEB J. 2016;30(Suppl 1):406.2.


Background Diet and the microbiome play an important role in human health. However, the interplay of diet, the microbiome, and health and disease is under-investigated. Furthermore, there is a dearth of information on the impact of specific foods on the gastrointestinal microbiome. Diets rich in nuts have beneficial effects on cardiovascular disease risk factors, including reduced LDL cholesterol and inflammation. The exact mechanisms of these cardioprotective effects may be linked to the gastrointestinal microbiome.

Objective We aimed to assess the impact of walnut consumption on the human gastrointestinal microbiome as a means for understanding the underlying mechanisms of the cardiometabolic protective effects of nut consumption.

Methods A controlled-feeding, randomized, crossover study was undertaken in healthy adult males (n=10) and females (n=8). Study participants received isocaloric treatment diets containing 0 grams or 42 grams/day of walnut pieces for a period of 3 weeks with a 1-week washout between periods. Blood, urine, and fecal samples were collected at the beginning and end of each treatment period for metabolic, immunologic, and microbial analyses. Barcoded amplicon pools of bacterial, fungal, and archaeal sequences were generated using a Fluidigm Access Array system prior to high-throughput sequencing on an Illumina MiSeq. Sequence data were analyzed with QIIME 1.8. Data were analyzed using the mixed-model procedure of SAS with post-hoc Tukey adjustments for multiple comparisons (SAS 9.4).

Results Principal coordinates analysis (PCoA) of UniFrac distances between samples based on their 97% OTU composition and abundances indicated that bacterial communities were impacted by walnut consumption (p<0.05). Walnut consumption increased (p=0.05) the relative abundance of Clostridium from baseline to end. Compared to the end of the control period, walnut consumption increased the relative abundances of Roseburia (p=0.02) and Dialister (p=0.02), and tended to increase the proportion of Faecalibacterium (p=0.07). The relative abundance of Oscillospira tended (p=0.08) to decrease from the end of the control period (no walnuts) compared to the end of the treatment period when walnuts were consumed.

Conclusions These novel results revealed that walnut consumption significantly affects the microbial composition of the human gastrointestinal microbiota. More specifically, walnut consumption increased bacterial genera associated with anti-inflammatory properties and production of the short-chain fatty acids butyrate and propionate. Short-chain fatty acids are purported to mediate some of the hypocholesterolemic effects of dietary fibers. These data help fill the gap in knowledge related to the cardioprotective effects of nut consumption. Further research on the bacterial fermentation of walnuts is needed to determine if the changes in these microbial communities translate to increased production of short-chain fatty acids.

Support or Funding Information

This study was funded by USDA and California Walnut Commission.


3. Cooper JA, Stevenson JL, Paton CM. 

Hunger and satiety responses to saturated fat-rich meals before and after a high PUFA diet.

FASEB J. 2016;30(Suppl 1):405.7.


Objective High-fat meals rich in polyunsaturated fats (PUFAs) have been shown to induce greater satiety than meals rich in saturated (SFA) or monounsaturated (MUFA) fats in an acute setting. It remains unknown, however, what the effects of a longer-term high-PUFA diet are on hunger and satiety.

Purpose To determine hunger and satiety responses to high SFA test meals before and after a 7-day high PUFA diet.

Methods Eighteen normal-weight, sedentary adults (BMI=21.4±2.3 kg/m2; age=22.7±4.1 y) were randomly assigned to either a PUFA or control diet. Following a 3-day lead-in diet, participants reported for the baseline visit, where anthropometrics and a fasting blood sample were collected. Participants then consumed two high-fat meals (breakfast and lunch) that were rich in SFAs (47% of total energy as SFA). Postprandial blood draws were collected approximately every 30 minutes for 4 hours after each high-fat meal, for a total of 8 hours. Participants then consumed a high PUFA diet (50% carbohydrate, 15% protein, 35% fat, of which 21% of total energy was PUFA) or control diet (50% carbohydrate, 15% protein, 35% fat, of which 7% of total energy was PUFA) for the next 7 days. All food and drink were provided by research personnel, and diets were designed to meet each participant's estimated daily energy needs. Following the 7-day diet, participants completed the post-visit, which included the same procedures as the baseline visit (2 high SFA meals and 8 hours of blood draws). Blood samples were analyzed for ghrelin (a hunger hormone) and for peptide YY (PYY), insulin, and leptin (satiety hormones).

Results For fasting concentrations, the PUFA group showed a significant decrease in ghrelin (1147.1±101.4 vs. 967.0±85.9 pg/mL, p<0.05) and a significant increase in PYY (89.2±9.3 vs. 113.5±11.8 pg/mL, p<0.05) from pre- to post-diet. There was no change in fasting insulin or leptin concentrations, nor any change in the control group for any hormone. In response to the high SFA meal challenges, there was greater postprandial PYY after the 7-day high PUFA diet (postprandial average of 140.8±10.4 vs. 150.6±8.8 pg/mL for pre- and post-PUFA diet, respectively; p=0.05). Conversely, there were no significant differences in the postprandial responses of ghrelin, insulin, leptin, or VAS measures from pre- to post-diet in either the PUFA or control group (ns).

Conclusion A high PUFA diet consumed for seven days favorably altered physiological markers of hunger and satiety in the fasted and fed state. This study is the first to show that the types of fat consumed on a daily basis can alter long-term appetite hormone profiles.

Support or Funding Information

This study was funded by the California Walnut Commission.


4. Shay NF, Luo T, Miranda O, et al. 

Mice Fed High-fat Obesigenic Diets with Walnut Plus Other Whole Foods Demonstrate Metabolic Improvement and Changes in Gene Expression and Metabolomic Patterns.

FASEB J. 2016;30(Suppl 1):428.3.


We tested the hypothesis that addition of PUFA-rich walnut to a high-fat Western diet (HF) would improve metabolism in male C57BL/6J mice. Further, various polyphenol-rich foods were added to walnut-containing HF diets to evaluate the potential for additional metabolic improvement. Groups of mice (n=8 each) were provided either a low-fat diet (LF, 10% kcal fat), high-fat diet (HF, 45% kcal fat), HF supplemented with walnut (W), or W diet supplemented with blueberry (W+BB), raspberry (W+RB), apple (W+AP), cranberry (W+CB), cherry (W+CH), broccoli (W+BR), olive oil (W+OO), soy protein (W+SP), or green tea (W+GT) for 9 weeks. In week 8, a glucose tolerance test was conducted: W-fed mice showed improved glucose control vs. HF-fed mice, while W+RB- and W+AP-fed groups showed improvement vs. W-fed mice. After 9 weeks, LF-fed mice gained less weight than all other HF-fed groups. Histological analysis showed hepatic lipid in W-fed mice was not significantly different from LF-fed mice, while W+BR- and W+GT-fed mice had the lowest lipid levels with the exception of the LF-fed mice. Patterns of expression of serum cytokines indicated a generalized reduction of inflammatory cytokines in all W-fed groups vs. HF-fed mice.

Gene expression in LF, HF, W, W+RB, W+CH, and W+GT groups was measured using a focused gene array (Qiagen). Analysis of relative mRNA levels revealed that walnut-fed mice and walnut plus another whole food had differentially regulated gene expression compared to HF-fed and W-fed mice. Significantly regulated mRNAs were associated with functions related to lipid, carbohydrate, and xenobiotic metabolism, antioxidant effect, and inflammation. The analysis indicates each diet produces a unique expression pattern.

Finally, a global metabolomic study compared LF-, HF-, W-, W+CH-, W+RB-, and W+GT-fed mice. Changes in metabolite levels related to energetics, inflammation, and redox homeostasis were observed. Changes in carbohydrate metabolism were observed in W+CH-fed mice, with more subtle effects in W+RB-fed mice; these may reflect increased energy demand. Changes in biomarkers in HF- vs. LF-fed mice pointed to increasing inflammation in HF mice; W-fed mice showed increases in omega-3 polyunsaturated fatty acids, which may negatively regulate inflammation. W-fed mice supplemented with RB, CH, or GT showed signs of declining inflammation. Changes in biomarkers of redox homeostasis in W+CH-fed mice may suggest declining oxidative stress, which may also impact inflammation in the liver. Further studies assessing feces could identify microbiome-related changes associated with dietary supplementation, and an assessment of how dietary supplementation might affect obesity-related disease models (e.g., NASH) may identify new therapeutic opportunities to target disease. We conclude that intake of walnut, a high-PUFA food, on its own or in combination with a polyphenol-rich food, had significant effects on physiological parameters related to metabolic syndrome and produced changes in both hepatic gene expression and metabolite levels consistent with an improved metabolic state.

Support or Funding Information

California Walnut Commission

Edited by AlPater

Healthy Dietary Patterns and Risk of Mortality and ESRD in CKD: A Meta-Analysis of Cohort Studies.

Kelly JT, Palmer SC, Wai SN, Ruospo M, Carrero JJ, Campbell KL, Strippoli GF.

Clin J Am Soc Nephrol. 2016 Dec 8. pii: CJN.06190616. [Epub ahead of print]

PMID: 27932391



Patients with CKD are advised to follow dietary recommendations that restrict individual nutrients. Emerging evidence indicates that overall eating patterns may better predict clinical outcomes; however, current data on dietary patterns in kidney disease are limited.


This systematic review aimed to evaluate the association between dietary patterns and mortality or ESRD among adults with CKD. Medline, Embase, and reference lists were systematically searched up to November 24, 2015 by two independent review authors. Eligible studies were longitudinal cohort studies reporting the association of dietary patterns with mortality, cardiovascular events, or ESRD.


A total of seven studies involving 15,285 participants were included. Healthy dietary patterns were generally higher in fruit and vegetables, fish, legumes, cereals, whole grains, and fiber, and lower in red meat, salt, and refined sugars. In six studies, healthy dietary patterns were consistently associated with lower mortality (3983 events; adjusted relative risk, 0.73; 95% confidence interval, 0.63 to 0.83; risk difference of 46 fewer (29-63 fewer) events per 1000 people over 5 years). There was no statistically significant association between healthy dietary patterns and risk of ESRD (1027 events; adjusted relative risk, 1.04; 95% confidence interval, 0.68 to 1.40).
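The reported risk difference can be reproduced from the adjusted relative risk and an assumed control-group (baseline) event risk. A minimal Python sketch of that arithmetic, assuming a ~17% 5-year baseline mortality risk, which is back-calculated here for illustration and not reported in the abstract:

```python
def risk_difference_per_1000(baseline_risk: float, relative_risk: float) -> float:
    """Absolute events avoided per 1000 people, given a baseline risk and an RR."""
    return baseline_risk * (1.0 - relative_risk) * 1000

# RR = 0.73 with an assumed ~17% 5-year baseline mortality risk
fewer_events = risk_difference_per_1000(0.17, 0.73)
print(round(fewer_events))  # -> 46, matching the "46 fewer per 1000" figure
```

The same formula with the CI bounds of the RR (0.63 to 0.83) yields roughly the 29-63 fewer events per 1000 quoted above.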


Healthy dietary patterns are associated with lower mortality in people with kidney disease. Interventions to support adherence to increased fruit and vegetable, fish, legume, whole grain, and fiber intake, and reduced red meat, sodium, and refined sugar intake could be effective tools to lower mortality in people with kidney disease.


Adult; Carbohydrates; Cohort Studies; Confidence Intervals; Dietary Fiber; Edible Grain; Fabaceae; Fruit; Humans; Kidney Failure, Chronic; Longitudinal Studies; Nutrition; Red Meat; Risk; Sodium; Vegetables; Whole Grains; chronic kidney disease; dialysis; diet; diet quality; dietary patterns; mortality


GH/IGF-I/insulin system in centenarians.

Vitale G, Barbieri M, Kamenetskaya M, Paolisso G.

Mech Ageing Dev. 2016 Dec 5. pii: S0047-6374(16)30233-0. doi: 10.1016/j.mad.2016.12.001. [Epub ahead of print] Review.

PMID: 27932301


The endocrine system plays a major role in the regulation of several biological activities and in the ageing process. Evolutionary conservation of the GH/IGF-I/insulin pathway from worms to mice, and similarities in this system between mice and humans, raised expectations that downregulated activity of the GH/IGF-I/insulin pathway could be beneficial for the extension of human life span. Centenarians represent the best example of successful ageing, having reached the very extremes of the human life span and escaped or delayed the occurrence of several fatal age-related diseases, such as cancer and cardiovascular disease. This review describes the endocrine profile of centenarians concerning the GH/IGF-I/insulin system, focusing on the relevance of this pathway to the modulation of ageing and longevity.


IGF-I; ageing; centenarians; insulin; longevity


The Case for Low Blood Pressure Targets.

Flack JM, Nolasco C, Levy P.

Am J Hypertens. 2016 Aug 29. pii: hpw087. [Epub ahead of print] Review.

PMID: 27572960


The "totality" of hypertension clinical trial endpoint data has shown that the absolute benefit of pharmacological blood pressure (BP) lowering is directly related to the BP level and baseline cardiovascular risk, albeit with attenuation of the relative risk reduction per unit of BP lowering in patients with diabetes and chronic kidney disease. Absolute risk reductions with pharmacological treatment are greater with advancing age. Cardiovascular risk and mortality reductions attributable to pharmacological BP lowering have been demonstrated for progressively lower BP levels extending well below the conventional BP threshold (140/90mm Hg) for hypertension. Hypertension endpoint trials have shifted from determining the relative clinical benefits of various antihypertensive drugs to exploring whether lower than conventional BP targets in persons with BP levels spanning the prehypertensive to much higher BP strata confer clinical benefit. The more recent of these trials were "relatively" agnostic to the drugs used for BP lowering although several trials provided, but did not mandate the use of, specific agents. Pharmacological treatment benefit has been demonstrated at pretreatment BP levels even lower than the intensive SPRINT BP target (<120mm Hg) and a growing body of evidence suggests that substantial risk reduction can be achieved by maintaining a normal BP over time (rather than waiting for BP to exceed 140/90mm Hg before treating). Thus there is a compelling rationale to lower the BP threshold not just for a therapeutic goal but also for the initiation of pharmacological intervention.


SPRINT trial; blood pressure; hypertension; treatment recommendations.


Is it Time to Find a Role for Uric Acid Levels in the Prevention and Management of Hypertension?

Joe X. Xie and Salim S. Hayek

Am J Hypertens (2017) 30 (1): 16-18. doi: 10.1093/ajh/hpw139 First published online: November 10, 2016



Hypertension is among the most common medical conditions, affecting one-third of adults in the United States.1 Despite being one of the most extensively studied diseases, there have been no recent breakthroughs in the treatment of essential hypertension, and its pathophysiology—a complex interplay of genetic, environmental and behavioral factors—remains unclear.2 With a continuing need to understand the underlying biology of hypertension and identify novel therapeutic targets, there has been a resurgent interest in serum uric acid as a non-traditional biomarker thought to be involved in the pathogenesis of hypertension.3

Although traditionally viewed as the pathogenic factor in the development of gout, uric acid has been postulated as a potential mediator of hypertension for over a century.4-6 The association between elevated uric acid levels and hypertension has now been described in several multinational cohorts and is clearly independent of ethnicity, race, or gender.7,8 Moreover, large observational studies have consistently found that elevated uric acid levels predicted incident hypertension, suggesting uric acid may be a potential risk factor for hypertension.6,9 However, defining a clinical role for uric acid in hypertension has posed a challenge, partly due to the significant heterogeneity across studies, the inability to adjust for potential confounders such as dietary habits, alcohol use, and renal function, and the paucity of evidence indicating that a change in uric acid levels modulates outcomes.6,9

In this issue of the American Journal of Hypertension, Sung and colleagues present the largest registry-based study to date characterizing the association between serum uric acid levels with the incidence of hypertension in apparently healthy individuals participating in an employment-based health screening program in South Korea.10 Serum uric acid levels were measured in 96,606 Koreans (mean age 37 ± 7, 43% women) without a prior …


Baseline and Change in Uric Acid Concentration Over Time Are Associated With Incident Hypertension in Large Korean Cohort.

Sung KC, Byrne CD, Ryu S, Lee JY, Lee SH, Kim JY, Kim SH, Wild SH, Guallar E.

Am J Hypertens. 2016 Aug 24. pii: hpw091. [Epub ahead of print]

PMID: 27557861



It is uncertain whether high baseline uric acid (UA) or change in UA concentration over time is related to the development of incident hypertension. We investigated the relationships of (i) baseline serum UA concentration and (ii) change in UA concentration with incident hypertension.


A cohort study of 96,606 Korean individuals (with follow-up UA data available for 56,085 people) participating in a health check program was undertaken. Cox regression models were used to estimate adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs) for incident hypertension according to UA quartiles, with the lowest UA quartile as the reference, and also according to change in UA concentration, comparing individuals with an increase in UA to those with a decrease in UA concentration over time.


Total follow-up time was 8 years (median follow-up 3.3 years; interquartile range, 1.9-5.1). A total of 10,405 cases of incident hypertension occurred. In the fully adjusted regression models, the HRs (95% CI) for incident hypertension comparing the highest vs. the lowest quartiles of UA were 1.29 (1.19-1.38) in men and 1.24 (1.09-1.42) in women, with statistically significant P for trend in both genders. Additionally, stable or increasing UA concentration over time was associated with increased risk of incident hypertension, particularly in participants with baseline UA concentration ≥ median (aHR 1.14; 95% CI, 1.03-1.26 in men and aHR 1.18; 95% CI, 0.98-1.40 in women).


High initial UA concentration and increases in UA concentration over time should be considered independent risk factors for hypertension.


blood pressure; hypertension; risk; risk factors; uric acid.


The role of dietary carbohydrates in organismal aging.

Lee D, Son HG, Jung Y, Lee SV.

Cell Mol Life Sci. 2016 Dec 10. [Epub ahead of print] Review.

PMID: 27942749


Carbohydrates are essential nutrients that are used as a primary source of energy. Carbohydrate utilization should be properly controlled, as abnormal regulation of carbohydrate metabolism is associated with diseases, such as diabetes, cardiovascular diseases, and stroke. These metabolic syndromes have become a serious problem in developed countries, and there is an increased need for research examining the influence of carbohydrates on animal physiology. Diets enriched in glucose, a major carbohydrate, are also associated with accelerated aging in several model organisms, including yeast and Caenorhabditis elegans (C. elegans). Genetic factors that mediate the effects of high glucose diets on aging have been identified during the last decade, mostly through the use of C. elegans. In this review, we describe studies that determine the effects of carbohydrate-enriched diets on aging by focusing on the mechanisms through which evolutionarily conserved pathways mediate the lifespan-altering effects of glucose in C. elegans. These include the insulin/insulin-like growth factor-1, sterol-regulatory element-binding protein, and AMP-activated protein kinase signaling pathways. We also discuss the effects of various carbohydrates and carbohydrate-derived metabolites on aging in model organisms and cultured mammalian cells. Finally, we discuss how dietary carbohydrates influence health and aging in humans.


Dihydroxyacetone phosphate; FOXO; Longevity; MDT-15; Reactive oxygen species; Sugar


[The below paper is not pdf-availed.]

Exaggerated exercise blood pressure response in middle-aged men as a predictor of future blood pressure: a 10-year follow-up.

Ito K, Iwane M, Miyai N, Uchikawa Y, Mugitani K, Mohara O, Shiba M, Arita M.

Clin Exp Hypertens. 2016 Dec 12:1-5. [Epub ahead of print]

PMID: 27936961



The prognostic value of an exaggerated exercise systolic blood pressure response (EESBPR) remains controversial. This study was designed to assess whether an EESBPR predicts future blood pressure.


From an initial population of 1,534 male subjects with normal BP or on no medication who underwent ergometric exercise, 733 subjects (mean age at baseline: 41 years) with follow-up BP measured after an average of 10 years were selected. A 12-min exercise tolerance test with three phases of estimated load from predicted maximum oxygen intake was performed at baseline, and exercise BP was measured.


Exercise BP response was classified into three groups: Low group (G) (exercise SBP < 180 mmHg), Middle G (exercise SBP 180-199 mmHg), and High G (exercise SBP ≥ 200 mmHg). BP after 10 years was 123 ± 12/79 ± 7 mmHg in Low G, 127 ± 13/81 ± 8 mmHg in Middle G, and 134 ± 15/84 ± 10 mmHg in High G. Compared with Low G, BP after 10 years in High G was significantly increased (p < 0.05). Multiple regression analysis was carried out to clarify the relationship of exercise SBP at baseline to BP after 10 years. In multivariate-adjusted models, SBP at follow-up was more strongly related to exercise SBP (β = 0.271, P < 0.001) than to resting SBP (β = 0.148, P < 0.001). Maximum oxygen intake (β = -0.193, P = 0.003) and resting SBP also correlated with SBP after 10 years.


In middle-aged men, exercise SBP is a stronger predictor of future SBP and DBP than BP at rest. Even within the optimal BP classification (SBP < 120 mmHg), exercise BP response was clearly associated with BP after 10 years.


Exaggerated BP elevation; exercise; exercise SBP; future hypertension; hypertension


Opposing effects of sodium intake on uric acid and blood pressure and their causal implication.

Juraschek SP, Choi HK, Tang O, Appel LJ, Miller ER 3rd.

J Am Soc Hypertens. 2016 Dec;10(12):939-946.e2. doi: 10.1016/j.jash.2016.10.012.

PMID: 27938853


Reducing uric acid is hypothesized to lower blood pressure, although evidence is inconsistent. In this ancillary study of the DASH-Sodium trial, we examined whether sodium-induced changes in serum uric acid (SUA) were associated with changes in blood pressure. One hundred and three adults with prehypertension or stage 1 hypertension were randomly assigned to receive either the DASH diet or a control diet (typical of the average American diet) and were fed each of three sodium levels (low, medium, and high) for 30 days in random order. Body weight was kept constant. SUA was measured at baseline and following each feeding period. Participants were 55% women and 75% black. Mean age was 52 (SD, 10) years, and mean SUA at baseline was 5.0 (SD, 1.3) mg/dL. Increasing sodium intake from low to high reduced SUA (-0.4 mg/dL; P < .001) but increased systolic (4.3 mm Hg; P < .001) and diastolic blood pressure (2.3 mm Hg; P < .001). Furthermore, changes in SUA were independent of changes in systolic (P = .15) and diastolic (P = .63) blood pressure, regardless of baseline blood pressure, baseline SUA, randomized diet, and sodium sensitivity. Although both SUA and blood pressure were influenced by sodium, a common environmental factor, their effects were in opposite directions and were unrelated to each other. These findings do not support a consistent causal relationship between SUA and BP.


Hypertension; sodium; trial; uric acid


Sleep duration and total cancer mortality: a meta-analysis of prospective studies.

Ma QQ, Yao Q, Lin L, Chen GC, Yu JB.

Sleep Med. 2016 Nov - Dec;27-28:39-44. doi: 10.1016/j.sleep.2016.06.036.

PMID: 27938917



Epidemiological evidence suggests a possible association between sleep duration and cancer-related mortality, but the reported findings are inconsistent. We conducted a meta-analysis of prospective studies to evaluate the relationships between sleep duration and cancer mortality.


Potentially relevant studies were identified by searching the PubMed and Embase databases, in addition to manual searches of references of retrieved full publications. The summary relative risks (RRs) with 95% confidence intervals (CIs) were computed using a random-effects model. Meta-regression analyses were performed to explore any potential effect modifier.


A total of 17 reports from 11 independent prospective studies were included in this meta-analysis. Compared with reasonable sleep duration (mostly defined as 7 or 7-8 h), the summary RR for long sleep duration (mostly defined as ≥9 or ≥10 h) was 1.11 (95% CI = 1.05-1.18) and for short sleep duration (mostly defined as ≤6 or ≤5 h) was 1.05 (95% CI = 0.99-1.11), with little evidence of heterogeneity. There was evidence of publication bias for the association of long sleep duration with cancer mortality, and the summary RR was slightly attenuated to 1.10 (95% CI = 1.02-1.18) after using a statistical method to correct for the bias.
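The summary RRs above come from random-effects pooling of study-level estimates. A minimal sketch of the standard DerSimonian-Laird procedure, using three hypothetical study RRs and variances (placeholders for illustration only, not the 11 cohorts analyzed in the paper):

```python
import math

def dersimonian_laird(log_rrs, variances):
    """Return (pooled RR, tau^2) from per-study log-RRs and sampling variances."""
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rrs))  # heterogeneity Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)           # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rrs)) / sum(w_star)
    return math.exp(pooled), tau2

# Hypothetical study-level RRs for long vs. reasonable sleep duration
rrs = [1.20, 1.04, 1.12]
log_rrs = [math.log(r) for r in rrs]
variances = [0.010, 0.020, 0.015]
pooled_rr, tau2 = dersimonian_laird(log_rrs, variances)
print(round(pooled_rr, 2), round(tau2, 4))
```

The pooled RR always lies between the smallest and largest study RRs; when Q is small relative to its degrees of freedom, tau^2 is zero and the estimate reduces to fixed-effect inverse-variance pooling.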


This meta-analysis of prospective studies suggests that long, but not short, sleep duration is associated with significantly increased risk of total cancer mortality.


Cancer; Meta-analysis; Mortality; Sleep duration


Beverage Intake and Metabolic Syndrome Risk Over 14 Years: The Study of Women's Health Across the Nation.

Appelhans BM, Baylin A, Huang MH, Li H, Janssen I, Kazlauskaite R, Avery EF, Kravitz HM.

J Acad Nutr Diet. 2016 Dec 6. pii: S2212-2672(16)31293-X. doi: 10.1016/j.jand.2016.10.011. [Epub ahead of print]

PMID: 27938940



Consumption of alcohol and energy-dense beverages has been implicated in cardiometabolic disease, albeit inconsistently.


This study tested prospective associations between intakes of alcohol, energy-dense beverages, and low-calorie beverages and cardiometabolic risk in midlife women.


The Study of Women's Health Across the Nation is a 14-year, multisite prospective cohort study (1996-2011). Beverage intake and cardiometabolic risk factors that define the metabolic syndrome (hypertension, abdominal obesity, impaired fasting glucose, low high-density lipoprotein cholesterol level, and hypertriglyceridemia) were assessed throughout follow-up.


Participants (N=1,448) were African American, Chinese, Japanese, and non-Hispanic white midlife women from six US cities.


The primary outcomes were incident metabolic syndrome and the individual metabolic syndrome components.


Generalized linear mixed models tested associations between intakes within each beverage category and odds of meeting criteria for metabolic syndrome and each of the metabolic syndrome components.


Energy-dense beverage consumption was highest among African-American women and lowest among women with college degrees. Non-Hispanic white women consumed the largest quantities of alcohol. Independent of energy intake and potential confounders, each additional 355 mL of energy-dense beverages consumed per day was associated with higher odds of developing metabolic syndrome in each successive year of follow-up (odds ratio [OR] 1.05, 95% CI 1.02 to 1.08). Greater energy-dense beverage intake was associated with more rapidly increasing odds of developing hypertension (OR 1.06, 95% CI 1.02 to 1.11) and abdominal obesity (OR 1.10, 95% CI 1.03 to 1.16) over time, but not with the other metabolic syndrome components. Intakes of alcohol and of low-calorie coffees, teas, and diet cola were not associated with metabolic syndrome risk.
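The per-year OR of 1.05 compounds over follow-up: on this reading, the odds associated with each extra 355 mL/day grow about 5% with every additional year. A short sketch of that compounding over the 14-year study window (illustrative arithmetic, not a result reported in the paper):

```python
def compounded_or(per_year_or: float, years: int) -> float:
    """Cumulative odds ratio after compounding a per-year OR over several years."""
    return per_year_or ** years

# OR 1.05 per year of follow-up, compounded over the full 14-year study
print(round(compounded_or(1.05, 14), 2))  # -> 1.98, roughly doubled odds
```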


Over 14 years of follow-up, energy-dense nonalcoholic beverage consumption was associated with incident metabolic syndrome in midlife women. The observed differences in intakes by ethnicity/race and education suggest that consumption of these beverages may contribute to disparities in risk factors for diabetes and cardiovascular disease.


Alcoholic beverages; Beverages; Hypertension; Metabolic syndrome


The Association Between Protein Intake by Source and Osteoporotic Fracture in Older Men: A Prospective Cohort Study.

Langsetmo L, Shikany JM, Cawthon PM, Cauley JA, Taylor BC, Vo TN, Bauer DC, Orwoll ES, Schousboe JT, Ensrud KE; Osteoporotic Fractures in Men (MrOS) Research Group..

J Bone Miner Res. 2016 Dec 12. doi: 10.1002/jbmr.3058. [Epub ahead of print]

PMID: 27943394


Dietary protein is a potentially modifiable risk factor for fracture. Our objectives were to assess the association of protein intake with incident fracture among older men and whether these associations varied by protein source or by skeletal site. We studied a longitudinal cohort of 5875 men (mean age 73.6, SD = 5.9 years) in the Osteoporotic Fractures in Men (MrOS) study. At baseline, protein intake was assessed as percent of total energy intake (TEI), with mean intake from all sources = 16.1%TEI. Incident clinical fractures were confirmed by physician review of medical records. There were 612 major osteoporotic fractures, 806 low-trauma fractures, 270 hip fractures, 193 spine fractures, and 919 non-hip non-spine fractures during 15 years of follow-up. We used Cox proportional hazards models with age, race, height, clinical site, TEI, physical activity, marital status, osteoporosis, gastrointestinal surgery, smoking, oral corticosteroid use, alcohol consumption, and calcium and vitamin D supplements as covariates to compute hazard ratios (HR) with 95% confidence intervals (CI), all expressed per unit (SD = 2.9%TEI) increase. Higher protein intake was associated with a decreased risk of major osteoporotic fracture (HR = 0.92 [95% CI: 0.84, 1.00]), with a similar association found for low-trauma fracture. The association between protein and fracture varied by protein source; e.g., increased dairy protein and non-dairy animal protein were associated with a decreased risk of hip fracture (HR = 0.80 [95% CI: 0.65, 0.98] and HR = 0.84 [95% CI: 0.72, 0.97], respectively), while plant-source protein was not (HR = 0.99 [95% CI: 0.78, 1.24]). The association between protein and fracture also varied by fracture site; total protein was associated with a decreased risk of hip fracture (HR = 0.84 [95% CI: 0.73, 0.95]), but not clinical spine fracture (HR = 1.06 [95% CI: 0.92, 1.22]).
In conclusion, those with high protein intake (particularly high animal protein intake) as a percentage of TEI have a lower risk of major osteoporotic fracture. 
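The hazard ratios above are expressed "per unit (SD = 2.9%TEI) increase." A minimal sketch of how a Cox per-unit log-hazard coefficient converts to a per-SD hazard ratio; the coefficient here is back-solved from the reported HR of 0.92 purely to illustrate the conversion, not taken from the paper's model output:

```python
import math

SD = 2.9                              # SD of protein intake in %TEI, from the abstract
beta_per_unit = math.log(0.92) / SD   # hypothetical per-%TEI Cox coefficient

def hr_per_sd(beta: float, sd: float) -> float:
    """Hazard ratio for a one-SD increase in a covariate with log-hazard slope beta."""
    return math.exp(beta * sd)

print(round(hr_per_sd(beta_per_unit, SD), 2))  # -> 0.92 by construction
```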


epidemiology; fracture prevention; metabolism; nutrition; osteoporosis


A comprehensive meta-analysis on dietary flavonoid and lignan intake and cancer risk: level of evidence and limitations.

Grosso G, Godos J, Lamuela-Raventos R, Ray S, Micek A, Pajak A, Sciacca S, D'Orazio N, Rio DD, Galvano F.

Mol Nutr Food Res. 2016 Dec 12. doi: 10.1002/mnfr.201600930. [Epub ahead of print]

PMID: 27943649



To summarize available evidence on the association between dietary flavonoid as well as lignan intake and cancer risk in observational studies.


A systematic search on electronic databases of all English-language case-control and prospective studies published up to June 2016 was performed. Risk ratios (RRs) and 95% confidence intervals (CIs) were calculated by random-effects model, separately by study design. Heterogeneity and publication bias were tested. Out of the 143 studies included, meta-analyses of prospective studies showed isoflavones significantly associated with decreased risk of lung and stomach cancers, and nearly significantly with breast and colorectal cancers; total flavonoids showed a non-significant decreased risk of breast cancer. Meta-analyses of case-control studies showed total and/or individual classes of flavonoids associated with upper aero-digestive tract, colorectal, breast, and lung cancers, and isoflavones with ovarian, breast, colorectal, endometrial, and lung cancers.


Most evidence reported in previous meta-analyses was driven by case-control studies. Overall results may be promising but are inconclusive. Further prospective cohorts assessing dietary polyphenol exposure and studies using other methods to evaluate exposure (i.e. markers of consumption, metabolism, excretion) are needed to confirm and determine consumption levels required to achieve health benefits. 


Dietary polyphenols; cancer; flavonoids; lignans; meta-analysis; observational studies


JAMA December 13, 2016, Vol 316, No. 22, Pages 2325-2440




Understanding County-Level, Cause-Specific Mortality -- The Great Value—and Limitations—of Small Area Data

Cheryl R. Clark, MD, ScD; David R. Williams, PhD, MPH

JAMA. 2016;316(22):2363-2365. doi:10.1001/jama.2016.12818


US County-Level Trends in Mortality Rates for Major Causes of Death, 1980-2014

Laura Dwyer-Lindgren, MPH; Amelia Bertozzi-Villa, MPH; Rebecca W. Stubbs, BA; et al.


JAMA. 2016;316(22):2385-2401. doi:10.1001/jama.2016.13645

This population epidemiology study estimates county-level patterns in mortality rates for 21 major causes of death in the United States from 1980 through 2014.





Adapting to Artificial Intelligence -- Radiologists and Pathologists as Information Specialists

Saurabh Jha, MBBS, MRCS, MS; Eric J. Topol, MD

JAMA. 2016;316(22):2353-2354. doi:10.1001/jama.2016.17438

This Viewpoint discusses advances in artificial intelligence (AI) that challenge the traditional roles of diagnostic radiologists and pathologists and proposes that the 2 roles be merged into that of an information specialist tasked with managing and interpreting information extracted by AI in clinical contexts.



Translating Artificial Intelligence Into Clinical Care

Andrew L. Beam, PhD; Isaac S. Kohane, MD, PhD


Artificial Intelligence With Deep Learning Technology Looks Into Diabetic Retinopathy Screening

Tien Yin Wong, MD, PhD; Neil M. Bressler, MD


JAMA. 2016;316(22):2366-2367. doi:10.1001/jama.2016.17563


Development and Validation of a Deep Learning Algorithm for Detection of Diabetic Retinopathy in Retinal Fundus Photographs

Varun Gulshan, PhD; Lily Peng, MD, PhD; Marc Coram, PhD; et al.


JAMA. 2016;316(22):2402-2410. doi:10.1001/jama.2016.17216

This study assesses the sensitivity and specificity of an algorithm based on deep machine learning for automated detection of diabetic retinopathy and diabetic macular edema in retinal fundus photographs.


[Looks like it's only an old abstract.]

Impact of androgen deprivation therapy (ADT) on quality of life (QL), cognitive and physical function of patients with non-metastatic prostate cancer (PC).

Joly F, Alibhai S, Galica J, Soban F, Park A, Yi QL, Wagner L, Tannock I.

J Clin Oncol. 2005 Jun;23(16_suppl):4627.

PMID: 27947005


Background: Many patients receive ADT for PC, often for long periods. Here we evaluate the impact of ADT on QL, cognitive and physical function of such patients.


From October 2003 to June 2004, 57 patients with non-metastatic PC (treated ≥3 months with ADT) and 51 healthy controls (matched by age) were recruited to a case-control study. QL was assessed by the FACT-G with the subscale for fatigue (FACT-F) and by the Patient-Oriented Prostate Scale (PORPUS). Cognitive function was evaluated by the High Sensitivity Cognitive Screen (HSCS) and by a prototype FACT subscale (FACT-COG). Physical and general function were measured by the 6 Minute Walk Test (6MWT), upper extremity grip strength, the Timed Up and Go test, the Barthel Index of basic activities of daily living (ADL), the Lawton and Brody Instrumental ADL Scale, and the Geriatric Depression Scale (GDS).


Mean age (72 years), educational level, and medical history (apart from PC) were similar for patients and controls. Patients received ADT for rising PSA (47%) or as adjuvant treatment (53%). Mean duration of ADT was 2.4 yr (SD 1.7 yr). Patients had lower mean Hb (135 vs 149 g/L, p<0.0001) and Barthel Index score (p=0.03) than controls. Scores for the FACT-G and physical function did not differ significantly between patients and controls. Moderate or severe impairment of cognitive function evaluated by HSCS was similar for patients (23%) and controls (35%; p=0.2). Self-reported cognitive deficit and impact on QL measured by FACT-COG were also similar (p=0.8). The PORPUS summary score was worse for patients (mean, 68 versus 83, p<0.0001), with more moderate/severe symptoms (lost energy 36% vs 16%, p=0.03; poor bladder control 20% vs 4%, p=0.02; loss of sexual function/interest 95/89% vs 33/30%, p<0.0001). Patients had more severe fatigue (FACT-F score < 35, 14% vs 2%, p=0.03). There was a strong correlation (p<0.001) between fatigue, the Barthel Index, and all the symptoms of the PORPUS, but no association with the Hb level.


Patients treated with ADT experience more symptoms and severe fatigue than controls, but this study did not detect an effect on physical or cognitive functions. No significant financial relationships to disclose.


[Looks like it's only an old abstract.]

Changes in body composition after androgen deprivation therapy (ADT) in prostate cancer patients. Relationship with disease outcome.

Berruti A, Vana F, Tucci M, Mosca A, Russo L, Gorzegno G, Saini A, Perotti P, Tampellini M, Dogliotti L.

J Clin Oncol. 2008 May 20;26(15_suppl):16007.

PMID: 27947292


Background: ADT increases fat body mass (FBM), decreases lean body mass (LBM), and decreases bone mineral density (BMD) in men with prostate carcinoma. No data are currently available regarding the relationship between treatment-related changes in body composition and patient outcome.


Using dual-energy x-ray absorptiometry (DEXA), we determined BMD, FBM, and LBM at baseline and after 1 and 2 years in 50 consecutive patients with high-risk non-metastatic prostate cancer treated with luteinizing hormone-releasing hormone analogues (LHRH-A), recruited between 1997 and 2000.


Patient median age (range) was 71 yrs (44-83). Nineteen patients had AUA stage B disease, 27 stage C, and 4 stage D1. BMD (g/cm2) at the lumbar spine [mean, 95% confidence interval (CI)] was 0.943 (0.874-1.013) at baseline, 0.933 (0.866-1.00) after 1 year, and 0.927 (0.863-0.991) after 2 years (p<0.03); the corresponding LBM data (g) were 50216 (48068-52364), 49553 (47314-51791), and 49377 (47247-51507), respectively (p<0.03); the FBM data (g) were 19463 (17143-21783), 21028 (18964-23093), and 21680 (19427-23932), respectively (p<0.001). After a median follow-up of 76 months, 11 patients (22.0%) experienced adverse skeletal-related events (SREs), 20 (40.0%) had PSA progression, and 22 (44.0%) died. Changes in BMD after 1 year showed no relationship with time to SRE onset (TTSRE), PSA recurrence, or death. LBM decrease below the median change after 1 year did not correlate with either TTSRE or survival, but was predictive of a lower risk of PSA recurrence, just failing to attain statistical significance [hazard ratio (HR) 0.40, 95% CI 0.15-1.08, p=0.07]. FBM increase above the median change after 1 year was a significant predictor of higher risk of SREs [HR 5.40, 95% CI 1.09-26.8, p<0.04], higher risk of death [HR 3.45, 95% CI 1.29-9.23, p<0.02], and a (non-significant) higher risk of PSA progression [HR 2.07, 95% CI 0.80-5.40, p=0.13].


This exploratory analysis suggests that changes in body composition, FBM in particular, assessed by DEXA, may provide predictive information on outcome in prostate cancer patients given ADT. More mature data will be presented at the meeting.


[Looks like it's only an old abstract.]

Breast cancer in octogenarians.

Evron E, Goldberg H, Kuzmin A, Gutman H.

J Clin Oncol. 2005 Jun;23(16_suppl):833.

PMID: 27945166


Background: The number of elderly cancer patients in our society is increasing rapidly. However, guidelines for their care are often lacking, and treatments are influenced by various considerations and are usually less aggressive. Information regarding the natural course of the disease and the outcome of therapy in this population is of utmost importance.


We report on 135 women who were diagnosed with localized breast cancer at age 80 or older and were treated in 2 tertiary centers in Israel between 1991 and 2001. The medical records of all patients were retrospectively reviewed; the 135 patients included had localized disease and were deemed medically fit for surgery. Follow-up information was obtained from the medical charts and through phone calls to patients or their families. Death dates were obtained from the Israeli central census population data resources.


The median age at diagnosis was 83. Local treatment consisted of modified radical mastectomy, or lumpectomy and axillary dissection followed by radiation therapy ("standard local treatment" = SLT), in 51 (38%) patients, and less than standard treatment in 84 (62%) patients. 88% of the tumors were T1 or T2, and 75% were hormone-receptor positive. 90 (67%) tumors were infiltrating duct carcinoma (IDC), of which 80/90 (89%) were grade II or III. Lymph node metastases were detected in 31/72 (43%) patients who underwent axillary lymph node dissection (AXLND). Adjuvant treatment consisted of tamoxifen in 110 patients, radiation therapy in 36 patients, and chemotherapy in only 3 patients. At a median interval of 70 months, 18 (13%) patients had recurrent disease: 11/18 had local recurrence only and 7 had systemic disease. By that time 34 (25%) patients had died: 10 of breast cancer, 17 of other causes, and for 7 the cause of death could not be ascertained. There was a trend toward higher disease-specific mortality in patients who received less than SLT (p=0.09). The median overall survival has not been reached.


The treatment of octogenarians with breast cancer should be individualized. Most of the patients lived more than 6 years after diagnosis, a long enough period to enjoy the benefits of appropriate therapy. The data do not support the assumption that breast tumors of very old patients are more indolent.


[Looks like it's only an old abstract.]

Single-institution experience of colorectal cancer in the very elderly population (80 years and over).

Soares HP, Cusnir M, Nascimento F, Balducci L, Weinberg GB.

J Clin Oncol. 2008 May 20;26(15_suppl):20542.

PMID: 27949755


Background: More than 70% of newly diagnosed cases of colon cancer are in individuals older than 70 years. However, limited data are available about treatment options in elderly patients, particularly in those over 80 years of age. As life expectancy increases and performance status is maintained, this subgroup of patients should play an important role in the design of colon cancer trials.


To assess survival of very old patients based on the therapy received, we conducted a retrospective analysis of all consecutive colorectal patients, aged 80 and over, who were seen in one single institution (Mount Sinai Medical Center). Using our cancer registry database, we extracted data regarding age at diagnosis, site of tumor and pathology, stage, treatments and survival.


Over the last 26 years, 1390 patients (mean age at diagnosis 85 years) were seen in our institution. 73.8% of the patients had surgery as their only treatment; of these, 248 (24%) had only local tumor excision. Data on AJCC staging were available for 772 patients: 3.5% had in situ cancer, 34.3% stage I disease, 30.1% stage II, 19.5% stage III, and 12.6% stage IV. 98 patients received chemotherapy as part of their treatment. Stage II patients treated with chemotherapy (n=13) had a median overall survival of 75 months, compared with 46 months for patients who had surgery alone (n=207). In stage III patients (n=139), chemotherapy significantly increased survival (25 versus 49 months, p=0.03). In stage IV (n=64), there was no difference between the groups (8 versus 9 months, p=0.5).


Despite the lack of recommendations from the US Preventive Services Task Force, colon cancer screening plays an important role in cancer prevention in elderly patients, as we observed that many tumors were detected at early stages. Elderly colon cancer patients tolerate and tend to benefit from chemotherapy, especially at early stages. Few randomized trials are designed exclusively for the elderly, and most trials do not allow the inclusion of the elderly despite good performance status. Cancer studies designed to determine effective screening measures, treatments, and outcomes for this expanding age group are needed.


[Looks like it's only an old abstract.]

Dietary fat reduction in postmenopausal women with primary breast cancer: Phase III Women's Intervention Nutrition Study (WINS).

Chlebowski RT, Blackburn GL, Elashoff RE, Thomson C, Goodman MT, Shapiro A, Giuliano AE, Karanja N, Hoy MK, Nixon DW; WINS Investigators.

J Clin Oncol. 2005 Jun;23(16_suppl):10.

PMID: 27943760


Background: Despite preclinical and observational studies suggesting benefit, the influence of dietary fat on breast cancer outcomes has been controversial.


We conducted a randomized trial to test whether an intensive dietary intervention designed to reduce dietary fat intake was effective in influencing relapse-free survival in postmenopausal women with primary breast cancer. A total of 2,437 women with early stage resected breast cancer, 48-79 years old, were randomized within 365 days from surgery in a 40:60 ratio to dietary intervention or control groups at 37 U.S. sites. All received standard breast cancer management: mastectomy or lumpectomy plus radiation; tamoxifen for ER positive, protocol-defined chemotherapy for ER negative and optional chemotherapy for ER positive cases. The dietary intervention included eight bi-weekly individual counseling sessions conducted by centrally trained nutritionists who provided ongoing contacts throughout.


Patient characteristics and recurrence risk factors were balanced. Dietary fat intake reduction was greater in the dietary group (fat gram intake/day at 12 months, 33.3 ± 16.7, mean ± standard deviation (SD) versus 51.3 ± 24.4 in controls, respectively, p<0.001). After 60 months median follow-up, the 277 reported relapse events are outlined below by treatment group and receptor status.


Life-style intervention resulting in dietary fat intake reduction may improve the relapse-free survival of postmenopausal breast cancer patients. [Figure: see text] No significant financial relationships to disclose.


[Looks like it's only an old abstract.]

Influence of aetiological factors for breast cancer on outcome after diagnosis.

Barnett GC, Shah M, Redman K, Goodward S, Easton DF, Ponder BA, Pharoah PD.

J Clin Oncol. 2005 Jun;23(16_suppl):762.

PMID: 27945716


Background: Risk factors that influence the incidence of breast cancer may also affect survival after diagnosis.


We studied the prognostic effects of reproductive history, height, weight, body mass index (BMI), smoking history, alcohol intake, and prior use of the combined oral contraceptive pill (COC) and hormone replacement therapy (HRT) in 4816 women who had taken part in the population-based SEARCH breast cancer study. Hazard ratios (HRs) for death (all causes) were estimated for each risk factor using a Cox proportional hazards model. Multivariate analyses adjusted for grade, stage, and age at diagnosis.


Median follow-up was 5.27 years. On univariate analysis, reproductive history, age at menarche and menopause, menopausal status at diagnosis, smoking history, and prior use of HRT and the COC did not influence prognosis. Tumour grade and stage had a highly significant impact on overall survival (p<0.0001). Increasing height and decreasing weight were associated with a decreased risk of death. BMI was associated with an HR for death of 1.03 (95% CI 1.02-1.05; p=0.0002) per unit increase. Women in the highest quartile of BMI were 1.6 times more likely to die (95% CI 1.2-2.0) than those in the lowest quartile. Improved prognosis was seen with increasing current alcohol consumption: the HR for death was 0.98 per unit of alcohol consumed per week (95% CI 0.96-0.99; p=0.0043), and intake of 14 or more units was associated with an HR of 0.69 (95% CI 0.49-0.96). Previous alcohol intake did not significantly affect outcome. The effects of BMI and current alcohol consumption were not attenuated after adjusting for grade and stage. However, the effect of alcohol intake was only marginally significant at the 5% level after adjusting for multiple comparisons.
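As a rough internal-consistency check on the figures above (an illustrative sketch only; the 16-unit BMI spread between extreme quartiles is my assumption, not a number from the abstract), a per-unit hazard ratio compounds multiplicatively across a covariate difference under a Cox model:

```python
# Illustrative sketch: under a Cox proportional hazards model, a per-unit
# hazard ratio compounds multiplicatively across a covariate difference.
# The 16-unit BMI spread between extreme quartiles is an assumed value,
# not a figure reported in the abstract.
def cumulative_hr(per_unit_hr: float, units: float) -> float:
    """Hazard ratio implied by a per-unit HR over a difference of `units`."""
    return per_unit_hr ** units

# HR 1.03 per BMI unit over ~16 units is roughly consistent with the
# reported extreme-quartile contrast of 1.6:
print(round(cumulative_hr(1.03, 16), 2))  # 1.6
```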


Our finding that an increase in BMI is associated with a negative impact on survival supports previously published data. The apparent benefit of alcohol intake has not been described before. Other studies have either shown no effect or poorer outcome in heavy drinkers. Our findings are preliminary and may simply be due to chance or confounding; or may reflect a true biological effect. Further work is needed to clarify these alternatives.


[Looks like it's only an old abstract.]

Moderate alcohol consumption and breast cancer risk.

Chen WY, Willett WC, Rosner B, Colditz GA.

J Clin Oncol. 2005 Jun;23(16_suppl):515.

PMID: 27946384


Background: Higher levels of alcohol consumption have been associated with breast cancer risk, but studies have been inconclusive on the effect of lower levels.


We examined the relationship between alcohol use and breast cancer risk within the Nurses' Health Study, a prospective cohort of 121,700 registered nurses aged 30-55 in 1976 who have updated information on cancer risk factors and outcomes through biennial questionnaires. For this analysis, follow-up began in 1980, when the dietary questionnaire was first administered, and continued through 2002. Alcohol use was measured by food frequency questionnaires assessing the average frequency of consumption of alcoholic beverages over the past year. Beer, wine, and liquor were assessed separately and updated in 1984, 1986, 1990, 1994, and 1998. Average daily alcohol consumption in grams per day was calculated by multiplying the number of drinks by the average alcohol content (12.8 grams of alcohol per 12 oz serving of beer, 11.0 per 4 oz serving of wine, and 14.0 per serving of spirits). Proportional hazards models controlled for age, body mass index, parity, age at first birth, type and duration of postmenopausal hormone use, family history of breast cancer, benign breast disease, type of menopause, and ages at menarche and menopause.
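The grams-per-day conversion described above is simple bookkeeping; a minimal sketch (the beverage keys and function name are illustrative, not from the study's code; the per-serving gram factors are those stated in the abstract):

```python
# Per-serving alcohol content, in grams, as stated in the abstract:
# 12.8 g per 12 oz beer, 11.0 g per 4 oz wine, 14.0 g per serving of spirits.
GRAMS_PER_SERVING = {"beer": 12.8, "wine": 11.0, "spirits": 14.0}

def daily_alcohol_grams(servings_per_day: dict) -> float:
    """Average daily alcohol intake (g/day) from servings/day by beverage."""
    return sum(GRAMS_PER_SERVING[bev] * n for bev, n in servings_per_day.items())

# e.g. half a glass of wine plus a quarter of a beer per day:
print(round(daily_alcohol_grams({"wine": 0.5, "beer": 0.25}), 1))  # 8.7
```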


Invasive breast cancer was diagnosed among 937 premenopausal and 4746 postmenopausal women with dietary data during the follow-up period. The association between alcohol use and breast cancer risk was seen only among women with postmenopausal breast cancer (RR (95% CI): 1.06 (0.98-1.13) for 0-5 g/day, 1.14 (1.03-1.26) for 5-9.9 g/day, 1.15 (1.05-1.26) for 10-19.9 g/day, and 1.41 (1.27-1.57) for ≥20 g/day; p for trend < 0.0001), but not premenopausal breast cancer. Although the magnitude of risk was small, there was a statistically significant increased risk of breast cancer at alcohol intakes < 10 g/day. The elevated risk was also seen mainly among women who developed ER+/PR+ cancers and was not modified by body mass index, postmenopausal hormone use (estrogen alone or combination estrogen/progesterone), or type of alcoholic beverage.


Even modest levels of alcohol consumption may be associated with breast cancer risk, especially ER+/PR+ cancers.


[Looks like it's only an old abstract.]

Caffeine consumption and risk of breast cancer in a large prospective cohort of women.

Ishitani K, Lin J, Manson JE, Buring JE, Zhang SM.

J Clin Oncol. 2008 May 20;26(15_suppl):11060.

PMID: 27947716


Background: Given the high prevalence of consumption of caffeinated beverages and foods, an association between caffeine intake and risk of breast cancer would be of great public health importance. However, prospective data relating consumption of coffee and other caffeinated beverages and foods to breast cancer risk are limited. We evaluated these associations among women enrolled in a large completed randomized trial of aspirin and vitamin E for chronic disease prevention.


Detailed dietary information was obtained by food frequency questionnaire at baseline (1992-1995) among 38,453 women who were aged 45 years or older and free of cancer and cardiovascular disease. During an average of 10 years of follow-up, we identified 1188 invasive breast cancer cases. Cox proportional hazards regression models were used to calculate relative risks (RRs) and 95% confidence intervals (CIs). All statistical tests were two-sided.


Consumption of caffeine was not statistically significantly associated with overall risk of breast cancer. The multivariable RR of breast cancer was 1.02 (95% CI = 0.84 to 1.23) for caffeine (top vs. bottom quintile). The associations did not differ significantly by menopausal status, postmenopausal hormone use, or body mass index. However, among women with benign breast disease, a significant positive association with breast cancer risk was observed for the highest quintile of caffeine (multivariable RR = 1.32; 95% CI = 1.00 to 1.76) and for the highest category of coffee (≥4 cups/day) (multivariable RR = 1.35; 95% CI = 1.01 to 1.81). In addition, caffeine consumption was significantly positively associated with risk of developing ER-PR- breast cancer (multivariable RR = 1.68; 95% CI = 1.01 to 2.81) and breast tumors of >2 cm in size (multivariable RR = 1.79; 95% CI = 1.18 to 2.73).


Data from this large prospective cohort show no overall association between consumption of caffeine and caffeinated beverages and foods and risk of breast cancer. The possibility of an increased risk among women with benign breast disease or for tumors that are ER-PR- or greater than 2 cm in size warrants further study.


Prospective study of breast cancer in relation to coffee, tea and caffeine in Sweden.

Oh JK, Sandin S, Ström P, Löf M, Adami HO, Weiderpass E.

Int J Cancer. 2015 Oct 15;137(8):1979-89. doi: 10.1002/ijc.29569.

PMID: 25885188 Free Article


Studies of coffee and tea consumption and caffeine intake as risk factors for breast cancer are inconclusive. We assessed coffee and tea consumption, caffeine intake, and possible confounding factors among 42,099 women from the Swedish Women's Lifestyle and Health study, the participants of which were aged 30-49 years at enrollment in 1991-1992. Complete follow-up for breast cancer incidence was performed through 2012 via linkage to national registries. Poisson regression models were used to estimate relative risks (RRs) and 95% confidence intervals (CIs) for breast cancer. During follow-up 1,395 breast cancers were diagnosed. The RR was 0.97 (95% CI 0.94-0.99) for a 1-unit increase in cups of coffee/day, 1.14 (95% CI 1.05-1.24) for a 1-unit increase in cups of tea/day, and 0.97 (95% CI 0.95-1.00) for a 100 mg/day increase in caffeine intake. Although the RR for no consumption (RR = 0.86, 95% CI 0.69-1.08), a group with a relatively small number of women, was not statistically significant, women with higher consumption had a decreased breast cancer risk (3-4 cups/day: RR = 0.87, 95% CI 0.76-1.00; ≥5 cups/day: RR = 0.81, 95% CI 0.70-0.94) compared to women consuming 1-2 cups of coffee/day. Compared to no consumption, women consuming >1 cup of tea/day showed an increased breast cancer risk (RR = 1.19, 95% CI 1.00-1.42). Similar patterns of estimates were observed for breast cancer risk overall, during pre- and postmenopausal years, and for ER+ or PR+ breast cancer, but not for ER- and PR- breast cancer. Our findings suggest that coffee consumption and caffeine intake are negatively associated with the risk of overall and ER+/PR- breast cancer, and that tea consumption is positively associated with the risk of overall and ER+/PR+ breast cancer.


Sweden; breast cancer; caffeine; coffee; cohort; tea


[Looks like it's only an old abstract.]

Frequency of vitamin D (Vit D) deficiency at breast cancer (BC) diagnosis and association with risk of distant recurrence and death in a prospective cohort study of T1-3, N0-1, M0 B.

Goodwin PJ, Ennis M, Pritchard KI, Koo J, Hood N.

J Clin Oncol. 2008 May 20;26(15_suppl):511.

PMID: 27948561


Background: Vit D acts through a nuclear transcription factor to regulate many aspects of cellular growth and differentiation. Low levels have been associated with increased BC risk. We examined Vit D levels and prognostic effects in an existing BC cohort.


512 consecutive women with newly diagnosed BC were enrolled at 3 University of Toronto hospitals between 1989 and 1995. A blood specimen obtained at diagnosis was stored at -80°C. The Block questionnaire was used to measure dietary intake. Clinical and pathology data were obtained from medical and pathology records. 25-OH Vit D was measured by radioimmunoassay. Women were followed prospectively to 2006.


Mean age was 50.4±9.7 yrs. 288 women had T1 tumors, 164 T2 and 24 T3/4. 356 tumors were N0. 342 were estrogen receptor (ER) positive. 73 tumors were grade 1, 202 grade 2 and 173 grade 3. 199 women received adjuvant chemotherapy (CXT) and 200 received tamoxifen. 116 women (22.7%) had distant recurrences and 106 (20.7%) died during a median follow-up of 11.6 yrs. Mean 25-OH Vit D was 58.1±23.4 nmol/L. Vit D levels were deficient (<50 nmol/L) in 192 (37.5%), insufficient (50-72 nmol/L) in 197 (38.5%) and adequate (>72 nmol/L) in 123 (24.0%). Low Vit D levels were associated with premenopausal status, high body mass index (BMI), high insulin and high tumor grade (all p≤0.03). Low Vit D levels were associated with low dietary intake of retinol, Vitamin E, grains and alcohol (all p<0.02). Vit D was marginally lower when drawn in winter (Oct-Mar) vs summer (Apr-Sept) months (56.7 vs 59.5 nmol/L, p=0.07). Distant disease-free survival (DDFS) was significantly worse in women with deficient (vs adequate) Vit D levels (HR 1.94, 95% CI 1.16-3.24, p=0.02) as was overall survival (OS) (HR 1.73, 95% CI 1.05-2.86, p=0.02). Vit D associations with DDFS were independent of age, BMI, insulin, T and N stage, ER and grade (all HR ≥1.55 Q1 vs Q4, all p ≤ 0.04); they were not significantly modified by ER, adjuvant CXT or tamoxifen. Vit D associations with OS were attenuated by grade and were absent in ER negative BC.


Vit D deficiency is common at BC diagnosis and is associated with poor prognosis. 


Soy Isoflavone Intake and Sleep Parameters over 5 Years among Chinese Adults: Longitudinal Analysis from the Jiangsu Nutrition Study.

Cao Y, Taylor AW, Zhen S, Adams R, Appleton S, Shi Z.

J Acad Nutr Diet. 2016 Dec 9. pii: S2212-2672(16)31298-9. doi: 10.1016/j.jand.2016.10.016. [Epub ahead of print]

PMID: 27956174



Soy isoflavone is beneficial for menopausal/postmenopausal symptoms, including sleep complaints. However, little is known about its longitudinal association with sleep in the general population.


Our aim was to investigate the association between soy isoflavone intake and sleep duration and daytime falling asleep among Chinese adults.


A longitudinal analysis was performed. Soy isoflavone intake was assessed by food frequency questionnaire. Sleep duration was self-reported at two time points. Occurrence of daytime falling asleep was determined at follow-up. Short and long sleep were defined as sleep <7 h/day or ≥9 h/day, respectively.


Adults aged 20 years and older from the Jiangsu Nutrition Study (2002-2007) with complete isoflavone intake and sleep duration data at both time points (n=1,474) were analyzed (follow-up, n=1,492).


We measured sleep duration in 2002 and 2007 and daytime falling asleep occurrence in 2007.


Mixed-effects logistic regression was performed for repeated measures between isoflavone intake and sleep duration. Logistic regression was performed for daytime falling asleep at follow-up. Demographic, anthropometric, and social factors were adjusted in the analyses.


The prevalence of long sleep duration was 18.9% in 2002 and 12.6% in 2007, and the prevalence of daytime falling asleep was 5.3%. Compared with the lowest quartile of isoflavone intake, the highest quartile was associated with a lower risk of long sleep duration (odds ratio=0.66; 95% CI 0.48 to 0.90; P for trend=0.018) over 5 years. Compared with persistent low intake of isoflavone (less than median intake of isoflavone at two time points), persistent high intake was associated with a reduced risk of daytime falling asleep in women (odds ratio=0.20; 95% CI 0.06 to 0.68), but not men. No consistent association between soy isoflavone intake and short sleep duration was found.


Soy isoflavone intake was associated with a low risk of long sleep duration in both sexes and a low risk of daytime falling asleep in women but not men.


Chinese adults; Daytime falling asleep; Long sleep duration; Longitudinal; Soy isoflavone


[intake of sugar, vegetables and fruit, and fibre were not associated with changes in CM risk factors in a statistically significant manner.]

Dietary intake and prospective changes in cardiometabolic risk factors in children and youth.

Setayeshgar S, Ekwaru JP, Maximova K, Majumdar SR, Storey KE, McGavock J, Veugelers PJ.

Appl Physiol Nutr Metab. 2016 Dec 13:1-7. [Epub ahead of print]

PMID: 27959641


Only a few studies have examined the effect of diet on prospective changes in cardiometabolic (CM) risk factors in children and youth, despite its importance for understanding the role of diet early in life in cardiovascular disease in adulthood. To test the hypothesis that dietary intake is associated with prospective changes in CM risk factors, we analyzed longitudinal observations made over a period of 2 years among 448 students (aged 10-17 years) from 14 schools in Canada. We applied mixed-effects regression to examine the associations of dietary intake at baseline with changes in body mass index, waist circumference (WC), systolic and diastolic blood pressure (SBP and DBP), and insulin sensitivity score between baseline and follow-up, while adjusting for age, sex, and physical activity. Dietary fat at baseline was associated with increases in SBP and DBP z scores (per 10 g increase in dietary fat per day: β = 0.03; p < 0.05) and WC (β = 0.31 cm; p < 0.05) between baseline and follow-up. Every additional gram of sodium intake at baseline was associated with an increase in DBP z score of 0.04 (p < 0.05) between baseline and follow-up. Intakes of sugar, vegetables and fruit, and fibre were not associated with changes in CM risk factors in a statistically significant manner. Our findings suggest that a reduction in the consumption of total dietary fat and sodium may contribute to the prevention of excess body weight and hypertension in children and youth, and their cardiometabolic sequelae later in life.


adolescents; cardiometabolic risk factors; diet; longitudinal study; obesity; public health


Combined statin and angiotensin-converting enzyme (ACE) inhibitor treatment increases the lifespan of long-lived F1 male mice.

Spindler SR, Mote PL, Flegal JM.

Age (Dordr). 2016 Sep 2. [Epub ahead of print]

PMID: 27590905


Statins, such as simvastatin, and ACE inhibitors (ACEis), such as ramipril, are standard therapies for the prevention and treatment of cardiovascular diseases. These types of drugs are commonly administered together. More recently, angiotensin II type 1 receptor (AT1R) antagonists, such as candesartan cilexetil (candesartan), have been used in the place of, or in combination with, ACEis. Here, we investigated the effects of simvastatin and ramipril single and combination therapy, and candesartan treatment on the lifespan of isocalorically fed, long-lived, B6C3F1 mice. Males were used for their relative endocrine simplicity and to minimize animal usage. The drugs were administered daily in food. The simvastatin and ramipril combination therapy significantly increased the mean and median lifespan by 9 %. In contrast, simvastatin, ramipril, or candesartan monotherapy was ineffective. All groups consumed the same number of calories. Simvastatin, alone or administered with ramipril, decreased body weight without changing caloric consumption, suggesting it may alter energy utilization in mice. Combination therapy elevated serum triglyceride and glucose levels, consistent with altered energy homeostasis. Few significant or consistent differences were found in mortality-associated pathologies among the groups. Simvastatin treatment did not reduce normal serum cholesterol or lipid levels in these mice, suggesting that the longevity effects may stem from the pleiotropic, non-cholesterol-related, effects of statins. Together, the results suggest that statins and ACEis together may enhance mouse longevity. Statins and ACE inhibitors are generally well-tolerated, and in combination, they have been shown to increase the lifespan of normotensive, normocholesterolemic humans.


ACE inhibitors; Angiotensin II receptor antagonists; Life span; Longevity; Statins


Basal body temperature as a biomarker of healthy aging.

Simonsick EM, Meier HC, Shaffer NC, Studenski SA, Ferrucci L.

Age (Dordr). 2016 Oct 26. [Epub ahead of print]

PMID: 27785691


Scattered evidence indicates that a lower basal body temperature may be associated with prolonged health span, yet few studies have directly evaluated this relationship. We examined cross-sectional and longitudinal associations between early morning oral temperature (95.0-98.6 °F) and usual gait speed, endurance walk performance, fatigability, and grip strength in 762 non-frail men (52 %) and women aged 65-89 years participating in the Baltimore Longitudinal Study of Aging. Since excessive adiposity (body mass index ≥35 kg/m2 or waist-to-height ratio ≥0.62) may alter temperature set point, associations were also examined within adiposity strata. Overall, controlling for age, race, sex, height, exercise, and adiposity, lower temperature was associated with faster gait speed, less time to walk 400 m quickly, and lower perceived exertion following 5-min of walking at 0.67 m/s (all p ≤ 0.02). In the non-adipose (N = 662), these associations were more robust (all p ≤ 0.006). Direction of association was reversed in the adipose (N = 100), but none attained significance (all p > 0.22). Over 2.2 years, basal temperature was not associated with functional change in the overall population or non-adipose. Among the adipose, lower baseline temperature was associated with greater decline in endurance walking performance (p = 0.006). In longitudinal analyses predicting future functional performance, low temperature in the non-adipose was associated with faster gait speed (p = 0.021) and less time to walk 400 m quickly (p = 0.003), whereas in the adipose, lower temperature was associated with slower gait speed (p = 0.05) and more time to walk 400 m (p = 0.008). In older adults, lower basal body temperature appears to be associated with healthy aging in the absence of excessive adiposity.


Aging; Body temperature; Excessive adiposity; Functional performance 


Using a polygenic score of DNA sequence polymorphisms, the authors of this study quantified genetic risk and assessed four healthy lifestyle factors. Among participants at high genetic risk, a healthy lifestyle was associated with a reduced risk of coronary disease.

Genetic Risk, Adherence to a Healthy Lifestyle, and Coronary Disease

Amit V. Khera, M.D., Connor A. Emdin, D.Phil., Isabel Drake, Ph.D., Pradeep Natarajan, M.D., Alexander G. Bick, M.D., Ph.D., Nancy R. Cook, Ph.D., Daniel I. Chasman, Ph.D., Usman Baber, M.D., Roxana Mehran, M.D., Daniel J. Rader, M.D., Valentin Fuster, M.D., Ph.D., Eric Boerwinkle, Ph.D., Olle Melander, M.D., Ph.D., Marju Orho-Melander, Ph.D., Paul M. Ridker, M.D., and Sekar Kathiresan, M.D.

N Engl J Med 2016; 375:2349-2358 December 15, 2016 DOI: 10.1056/NEJMoa1605086



Both genetic and lifestyle factors contribute to individual-level risk of coronary artery disease. The extent to which increased genetic risk can be offset by a healthy lifestyle is unknown.


Using a polygenic score of DNA sequence polymorphisms, we quantified genetic risk for coronary artery disease in three prospective cohorts — 7814 participants in the Atherosclerosis Risk in Communities (ARIC) study, 21,222 in the Women’s Genome Health Study (WGHS), and 22,389 in the Malmö Diet and Cancer Study (MDCS) — and in 4260 participants in the cross-sectional BioImage Study for whom genotype and covariate data were available. We also determined adherence to a healthy lifestyle among the participants using a scoring system consisting of four factors: no current smoking, no obesity, regular physical activity, and a healthy diet.


The relative risk of incident coronary events was 91% higher among participants at high genetic risk (top quintile of polygenic scores) than among those at low genetic risk (bottom quintile of polygenic scores) (hazard ratio, 1.91; 95% confidence interval [CI], 1.75 to 2.09). A favorable lifestyle (defined as at least three of the four healthy lifestyle factors) was associated with a substantially lower risk of coronary events than an unfavorable lifestyle (defined as no or only one healthy lifestyle factor), regardless of the genetic risk category. Among participants at high genetic risk, a favorable lifestyle was associated with a 46% lower relative risk of coronary events than an unfavorable lifestyle (hazard ratio, 0.54; 95% CI, 0.47 to 0.63). This finding corresponded to a reduction in the standardized 10-year incidence of coronary events from 10.7% for an unfavorable lifestyle to 5.1% for a favorable lifestyle in ARIC, from 4.6% to 2.0% in WGHS, and from 8.2% to 5.3% in MDCS. In the BioImage Study, a favorable lifestyle was associated with significantly less coronary-artery calcification within each genetic risk category.


Across four studies involving 55,685 participants, genetic and lifestyle factors were independently associated with susceptibility to coronary artery disease. Among participants at high genetic risk, a favorable lifestyle was associated with a nearly 50% lower relative risk of coronary artery disease than was an unfavorable lifestyle. (Funded by the National Institutes of Health and others.)
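
[As a rough illustration of the scoring idea, not the authors' actual SNP set or weights: a polygenic risk score is commonly computed as a weighted sum of risk-allele dosages, with weights taken from published per-allele log odds ratios. The variant IDs and odds ratios below are invented.]

```python
# Hypothetical sketch only: the variant IDs, dosages, and odds ratios below
# are invented, not the SNPs or weights used in the paper.
import math

# (variant ID, risk-allele dosage for one person [0, 1, or 2], per-allele OR)
variants = [
    ("rs0000001", 1, 1.12),
    ("rs0000002", 0, 1.08),
    ("rs0000003", 2, 1.05),
]

def polygenic_score(genotypes):
    """Sum of dosage * log(odds ratio) across variants."""
    return sum(dosage * math.log(or_) for _, dosage, or_ in genotypes)

score = polygenic_score(variants)
# Scores are then ranked across a cohort; the top quintile is treated as
# "high genetic risk" and the bottom quintile as "low genetic risk".
```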


The Human Intestinal Microbiome in Health and Disease

S.V. Lynch and O. Pedersen

The large majority of studies on the role of the microbiome in the pathogenesis of disease are correlative and preclinical; several have influenced clinical practice.


N Engl J Med 2016; 375:2369-2379 December 15, 2016 DOI: 10.1056/NEJMra1600266

Human-associated microbes have primarily been viewed through the lens of a single species and its environment. Advances in culture-independent technologies have shown the enormous diversity, functional capacity, and age-associated dynamics of the human microbiome (see the Glossary). A large number of diverse microbial species reside in the distal gastrointestinal tract, and gut  microbiota dysbiosis — imbalances in the composition and function of these intestinal microbes — is associated with diseases ranging from localized gastroenterologic disorders to neurologic, respiratory, metabolic, hepatic, and cardiovascular illnesses. Much effort is currently concentrated on exploring potential causality and related microbiota-mediated disease mechanisms, with the hope that an improved understanding will fuel the conception and realization of novel therapeutic and preventive strategies.


[it is just an abstract.]

Psychological wellbeing and all-cause mortality in the oldest old in China: a longitudinal survey-based study.

Gong E, Hua Y, Yan LL.

Lancet. 2016 Oct;388 Suppl 1:S22. doi: 10.1016/S0140-6736(16)31949-3.

PMID: 27968835



Although the relationship between psychological wellbeing and physical health has been the subject of many studies among middle-aged and older adults, little is known about whether psychological wellbeing is associated with mortality among the fastest growing population segment-the oldest old (aged ≥80 years).


This study included 18 676 adults aged 80-122 years from the Chinese Longitudinal Healthy Longevity Survey, conducted in 22 of 31 provinces in China in 1998 and followed up in 2000, 2002, 2005, 2008-2009, 2011-2012, and 2014. Psychological wellbeing was measured by seven items covering positive (optimism, sense of personal control, conscientiousness, and positive feelings about ageing) and negative (loneliness, anxiety, and loss of self-worth) affects with a five-point Likert scale. A psychological wellbeing index was constructed from the sum of these seven items, and scores were divided by quartile (Q1=0-22, Q2=23-25, Q3=26-28, and Q4=28-35), with higher scores indicating better wellbeing. The association between psychological wellbeing and all-cause mortality was evaluated with multivariable Cox proportional hazards regressions. Duke University Health System's Institutional Review Board, the National Bureau of Statistics of China, and the Ethical Committee of the Social Science Division of Peking University reviewed and approved ethics for the Chinese Longitudinal Healthy Longevity Survey study. Written consent was obtained from all participants or their proxies.


The mean age of the study population (58·8% women, n=10 977) at baseline was 92 years (SD 7·4). Most participants (89·2%, n=16 661) died during follow-up, which ranged from 0·1 to 16·5 years (median 2·8). Compared with participants with a Q1 wellbeing score, hazard ratios for death were 0·95 (0·91-0·99) for those with a Q2 score, 0·90 (0·86-0·94) for Q3, and 0·84 (0·79-0·88) for Q4 (all p values <0·0001) after adjustment for potential confounders, including age, sex, co-residence, residence, education, marital status, lifestyle, and self-reported health. Similar patterns were found in stratified analysis among people with and without chronic diseases (hypertension, diabetes, cardiovascular diseases, stroke, and respiratory diseases).


In a large sample of Chinese oldest old, we found a dose-response association between psychological wellbeing and all-cause mortality. Our results, if substantiated by future research, suggest that psychological factors in older age might have a role in longevity.


Marital History and Survival After Stroke

Matthew E. Dupre and Renato D. Lopes

J Am Heart Assoc. 2016;5:e004647. Originally published December 14, 2016.

doi: 10.1161/JAHA.116.004647



Stroke is among the leading causes of disability and death in the United States, and nearly 7 million adults are currently alive after experiencing a stroke. Although the risks associated with having a stroke are well established, we know surprisingly little about how marital status influences survival in adults with this condition. This study is the first prospective investigation of how marital history is related to survival after stroke in the United States.

Methods and Results 

Data from a nationally representative sample of older adults who experienced a stroke (n=2351) were used to examine whether and to what extent current marital status and past marital losses were associated with risks of dying after the onset of disease. Results showed that the risks of dying following a stroke were significantly higher among the never married (hazard ratio [HR], 1.55; 95% CI, 1.15–2.08), remarried (HR, 1.22; 95% CI, 1.05–1.43), divorced (HR, 1.22; 95% CI, 1.01–1.50), and widowed (HR, 1.32; 95% CI, 1.16–1.51) relative to those who remained continuously married. We also found that having multiple marital losses was especially detrimental to survival, regardless of current marital status and accounting for multiple socioeconomic, psychosocial, behavioral, and physiological risk factors.


Marital history is significantly associated with survival after stroke. Additional studies are needed to further examine the mechanisms contributing to the associations and to better understand how this information can be used to personalize care and aggressively treat vulnerable segments of the population.

marital status; mortality; stroke


Marital history 1971-91 and mortality 1991-2004 in England & Wales and Finland.

Blomgren J, Martikainen P, Grundy E, Koskinen S.

J Epidemiol Community Health. 2012 Jan;66(1):30-6. doi: 10.1136/jech.2010.110635.

PMID: 20924052



Little is known about the effects of long-term marital history on mortality, or about the relative importance of using marital history rather than baseline marital status in mortality analyses. No previous comparative studies of the association between marital history and mortality exist.


Longitudinal data from England & Wales and from Finland were used to assess the effects of marital history, constructed from census records from years 1971, 1981 and 1991, on all-cause mortality in 1991-2004 among men and women aged ≥ 50 years. Data from England & Wales include 57,492 deaths; data from Finland include 424,602 deaths. Poisson regression analysis was applied.


Adding marital history into models including baseline marital status was statistically significant when explaining male mortality, while it was generally not important for female mortality. Adjusted for socio-demographic covariates, those consistently married with no record of marital break-up had the lowest mortality rates among both men and women aged 50-74 in both countries. Those never married, those divorced with a history of divorce and those widowed with a history of widowhood showed the highest mortality risks. Associations between marital history and mortality were weaker among those aged 75+.


Consistent evidence was found in favour of both the protective effects of long-lasting marriage and the detrimental effects of marital dissolution. Studies would benefit from including marital history in the models instead of baseline marital status whenever possible, especially when studying male mortality.



Poor clock management and cancer

Paula A. Kiberstis

For most people, sleeping and waking on a regular schedule is an aspiration rather than a reality. Unfortunately, it is becoming clear that chronic disruption of the circadian clock, or “social jet lag,” can pose health risks. Kettner et al. explored how jet lag affects liver function in mice, inducing it experimentally by varying the times at which lights were switched on and off each week. Despite a healthy diet, the jet-lagged mice gained weight and developed fatty liver disease, which progressed to fibrosis and in some cases to hepatocellular carcinoma, a form of liver cancer. The livers of these mice showed marked dysregulation of metabolic pathways controlled by two specific nuclear receptors, FXR and CAR.

Circadian Homeostasis of Liver Metabolism Suppresses Hepatocarcinogenesis.

Kettner NM, Voicu H, Finegold MJ, Coarfa C, Sreekumar A, Putluri N, Katchy CA, Lee C, Moore DD, Fu L.

Cancer Cell. 2016 Dec 12;30(6):909-924. doi: 10.1016/j.ccell.2016.10.007.

PMID: 27889186


Chronic jet lag induces spontaneous hepatocellular carcinoma (HCC) in wild-type mice following a mechanism very similar to that observed in obese humans. The process initiates with non-alcoholic fatty liver disease (NAFLD) that progresses to steatohepatitis and fibrosis before HCC detection. This pathophysiological pathway is driven by jet-lag-induced genome-wide gene deregulation and global liver metabolic dysfunction, with nuclear receptor-controlled cholesterol/bile acid and xenobiotic metabolism among the top deregulated pathways. Ablation of farnesoid X receptor dramatically increases enterohepatic bile acid levels and jet-lag-induced HCC, while loss of constitutive androstane receptor (CAR), a well-known liver tumor promoter that mediates toxic bile acid signaling, inhibits NAFLD-induced hepatocarcinogenesis. Circadian disruption activates CAR by promoting cholestasis, peripheral clock disruption, and sympathetic dysfunction.


cholestasis; chronic circadian disruption; constitutive androstane receptor (CAR); farnesoid X receptor (FXR); fibrosis; hepatocarcinogenesis; non-alcoholic fatty liver disease; non-alcoholic steatohepatitis; social jet lag; sympathetic dysfunction


Proportion of invasive breast cancer attributable to risk factors modifiable after menopause.

Sprague BL, Trentham-Dietz A, Egan KM, Titus-Ernstoff L, Hampton JM, Newcomb PA.

Am J Epidemiol. 2008 Aug 15;168(4):404-11. doi: 10.1093/aje/kwn143.

PMID: 18552361 Free PMC Article


A number of breast cancer risk factors are modifiable later in life, yet the combined impact of population changes in these risk factors on breast cancer incidence has not, to our knowledge, been evaluated. The population attributable risk (PAR) associated with individual risk factors and the summary PAR for sets of modifiable and nonmodifiable risk factors were estimated by using data on 3,499 invasive breast cancer cases and 4,213 controls from a population-based study in Wisconsin, Massachusetts, and New Hampshire, conducted from 1997 to 2001. The summary PAR for factors modifiable after menopause, including current postmenopausal hormone use, recent alcohol consumption, adult weight gain, and recent recreational physical activity, was 40.7%. Of the individual modifiable factors, the highest PARs were observed for weight gain (21.3%) and recreational physical activity (15.7%), which together showed a summary PAR of 33.6%. The summary PAR for factors not modifiable after menopause, including family history of breast cancer, personal history of benign breast disease, height at age 25 years, age at menarche, age at menopause, age at first birth, and parity, was 57.3%. These findings suggest that a substantial fraction of postmenopausal breast cancer may be avoided by purposeful changes in lifestyle later in life.


Population Attributable Risk of Modifiable and Nonmodifiable Breast Cancer Risk Factors in Postmenopausal Breast Cancer.

Tamimi RM, Spiegelman D, Smith-Warner SA, Wang M, Pazaris M, Willett WC, Eliassen AH, Hunter DJ.

Am J Epidemiol. 2016 Dec 6. [Epub ahead of print]

PMID: 27923781


We examined the proportions of multiple types of breast cancers in the population that were attributable to established risk factors, focusing on behaviors that are modifiable at menopause. We estimated the full and partial population attributable risk percentages (PAR%) by combining the relative risks and the observed prevalence rates of the risk factors of interest. A total of 8,421 cases of invasive breast cancer developed in postmenopausal women (n = 121,700) in the Nurses' Health Study from 1980-2010. We included the following modifiable risk factors in our analyses: weight change since age 18 years, alcohol consumption, physical activity level, breastfeeding, and menopausal hormone therapy use. Additionally, the following nonmodifiable factors were included: age, age at menarche, height, a combination of parity and age at first birth, body mass index at age 18 years, family history of breast cancer, and prior benign breast disease. When we considered all risk factors (and controlled for age), the PAR% for invasive breast cancers was 70.0% (95% confidence interval: 55.0, 80.7). When considering only modifiable factors, we found that changing the risk factor profile to the lowest weight gain, no alcohol consumption, high physical activity level, breastfeeding, and no menopausal hormone therapy use was associated with a PAR% of 34.6% (95% confidence interval: 22.7, 45.4). The PAR% for modifiable factors was higher for estrogen receptor-positive breast cancers (PAR% = 39.7%) than for estrogen receptor-negative breast cancers (PAR% = 27.9%). Risk factors that are modifiable at menopause account for more than one-third of postmenopausal breast cancers; therefore, a substantial proportion of breast cancer in the United States is preventable.


PAR%; modifiable factors; postmenopausal breast cancer
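
[For readers unfamiliar with attributable risk: the simplest version of the calculation behind a PAR% is Levin's classic formula, which combines a factor's prevalence with its relative risk. This is a generic sketch, not the multivariable partial-PAR% method used in the two studies above, and the example numbers are invented.]

```python
# Generic illustration, not the authors' multivariable partial-PAR% method:
# Levin's formula gives the population attributable risk of a single factor
# from its prevalence p and relative risk RR. Example numbers are invented.
def levin_par(prevalence, relative_risk):
    # PAR = p*(RR - 1) / (1 + p*(RR - 1))
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. a factor carried by 30% of the population with RR = 1.5:
par = levin_par(0.30, 1.5)  # about 0.13, i.e. ~13% of cases attributable
```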


Regional Variation in Out-of-Hospital Cardiac Arrest Survival in the United States.

Girotra S, van Diepen S, Nallamothu BK, Carrel M, Vellano K, Anderson ML, McNally B, Abella BS, Sasson C, Chan PS; CARES Surveillance Group and the HeartRescue Project.

Circulation. 2016 May 31;133(22):2159-68. doi: 10.1161/CIRCULATIONAHA.115.018175.

PMID: 27081119



Although previous studies have shown marked variation in out-of-hospital cardiac arrest survival across US regions, factors underlying this survival variation remain incompletely explained.


Using data from the Cardiac Arrest Registry to Enhance Survival, we identified 96 662 adult patients with out-of-hospital cardiac arrest in 132 US counties. We used hierarchical regression models to examine county-level variation in rates of survival and survival with functional recovery (defined as Cerebral Performance Category score of 1 or 2) and examined the contribution of demographics, cardiac arrest characteristics, bystander cardiopulmonary resuscitation, automated external defibrillator use, and county-level sociodemographic factors in survival variation across counties. A total of 9317 (9.6%) patients survived to discharge, and 7176 (7.4%) achieved functional recovery. At a county level, there was marked variation in rates of survival to discharge (range, 3.4%-22.0%; median odds ratio, 1.40; 95% confidence interval, 1.32-1.46) and survival with functional recovery (range, 0.8%-21.0%; median odds ratio, 1.53; 95% confidence interval, 1.43-1.62). County-level rates of bystander cardiopulmonary resuscitation and automated external defibrillator use were positively correlated with both outcomes (P<0.0001 for all). Patient demographic and cardiac arrest characteristics explained 4.8% and 27.7% of the county-level variation in survival, respectively. Additional adjustment of bystander cardiopulmonary resuscitation and automated external defibrillator explained 41% of the survival variation, and this increased to 50.4% after adjustment of county-level sociodemographic factors. Similar findings were noted in analyses of survival with functional recovery.


Although out-of-hospital cardiac arrest survival varies significantly across US counties, a substantial proportion of the variation is attributable to differences in bystander response across communities.


cardiopulmonary resuscitation; heart arrest


County-Level Variation in Cardiovascular Disease Mortality in the United States in 2009-2013: Comparative Assessment of Contributing Factors.

Patel SA, Ali MK, Narayan KM, Mehta NK.

Am J Epidemiol. 2016 Nov 17. [Epub ahead of print]

PMID: 27864183


We examined factors responsible for variation in cardiovascular disease (CVD) mortality across US counties in 2009-2013. We linked county-level census, survey, administrative, and vital statistics data to examine 4 sets of features: demographic factors, social and economic factors, health-care utilization and features of the environment, and population health indicators. County-level associations of these features (standardized to a mean of 0 with a standard deviation of 1) with cardiovascular deaths per 100,000 person-years among adults aged 45-74 years were modeled using 2-level hierarchical linear regression with a random intercept for state. The percentage of CVD mortality variation (intercounty disparity) modeled by each set of features was quantified. Demographic composition accounted for 36% of county CVD mortality variation, and another 32% was explained after inclusion of economic/social conditions. Health-care utilization, features of the environment, and health indicators explained an additional 6% of CVD mortality variation. The largest contributors to CVD mortality levels were median income (-23.61 deaths/100,000 person-years, 95% CI: -26.95, -20.26) and percentage without a high school education (20.71 deaths/100,000 person-years, 95% CI: 16.48, 24.94). In comparison, the largest health-related contributors were health-care utilization (19.35 deaths/100,000 person-years, 95% CI: 16.36, 22.34) and CVD risk factors (4.80 deaths/100,000 person-years, 95% CI: 2.14, 7.46). Improving health-care access and decreasing the prevalence of traditional CVD risk factors may reduce county CVD mortality levels, but improving socioeconomic circumstances of disadvantaged counties will be required in order to reduce CVD mortality disparities across counties.


cardiovascular diseases; disparities; mortality; social determinants of health


[it is an abstract only.]

Primary androgen deprivation (PAD) followed by active surveillance for newly diagnosed prostate cancer (PC).

Scholz MC, Groom M, Kaddis A, Lam RY, Jennrich R.

J Clin Oncol. 2012 Feb 10;30(5_suppl):244.

PMID: 27968205



Background: Men undergoing local therapy (LT) with radiation or surgery incur substantial risk of permanent sexual, urinary and rectal toxicity. Active surveillance is only recommended for men with Low-Risk PC who have < 34% biopsy cores positive. Studies of PAD as an alternative to LT are sparse.


Retrospective analysis of 102 men with localized PC administered 12 months of an LHRH agonist and antiandrogen (PAD) followed by immediate biopsy (BX1). Outcome assessed: The incidence of implementing further treatment with either LT or androgen deprivation (AD) if there was subsequent progression on active surveillance (PAS) or if BX1 was positive. Post PAD monitoring included quarterly PSA testing, follow-up biopsy and imaging with color doppler ultrasound and endorectal MRI. Participants were screened for heart disease and osteoporosis and monitored for changes in body weight, BP, liver enzymes, anemia, blood sugar and cholesterol. All received empiric bisphosphonates and 5-alpha reductase inhibitors and were instructed to eat a low fat diet and perform resistance exercise.


Patient characteristics: D'Amico risk category (DRC) Low: n=22; Intermediate: n=30; High: n= 50. Baseline factors assessed as potential predictors of PAS: Age, PSA, PSA velocity, PSA nadir, Gleason, % core biopsies, stage, prostate volume, DRC. Baseline medians for the 102 men: Age 67.3, PSA 7.8, Gleason 3+4, Cores > 50%, stage T1c, prostate volume 45cc. 55 men had PAS and required LT and/or AD over a median of 7.3 years follow up. Type of treatment administered after PAS by DRC: Low: AD 1, LT 3. Intermediate: AD 5, LT 10. High: AD 15, LT 21. Two men developed clinical progression. There were no PC deaths. Four men died of other causes: lung CA, melanoma, emphysema and MI. Only DRC (p<0.01) and PSA nadir (p<0.04) predicted PAS (3 men had PSA nadir > 0.1 and all were positive at BX1).


PAD frequently induces durable remissions in men with Low and Intermediate risk PC despite a high prevalence of > 50% core biopsies positive at baseline in this population. 38% of the 55 men with PAD elected additional AD rather than undergoing LT suggesting that for some men enduring the toxicity profile of AD was preferable to the perceived risks of LT.


[it is an abstract only.]

Cancer aggressiveness and mortality in men of exceptional age.

Yung RL, Kurth T, Gaziano JM, Driver JA.

J Clin Oncol. 2009 May 20;27(15_suppl):11051.

PMID: 27963157



Background: Information on the characteristics of cancer in people ≥ 85 is limited, particularly in men.


We evaluated the type, grade and extent of cancer among the 22,071 men in the Physicians' Health Study by age at diagnosis (dx) (<65, 65-74, 75-84 and ≥85). All cases of cancer, deaths and cause of death were confirmed by medical record review. To investigate the relationship between age at dx and risk of cancer death, we matched newly diagnosed cancer patients to reference subjects by age and a modified Charlson comorbidity score. Participants were followed for all cause mortality. We estimated hazard ratios (HR) for death by age at dx using Cox proportional hazards models and adjusted for potential confounders.


Over a mean follow-up of 20.5 years, 5,623 incident cancers were confirmed. Prostate cancer remained the most common cancer across all age groups. Melanoma and lung cancer became less common with age, while unknown cancers and gastrointestinal cancers other than colorectal (other GI) became more common. There was no linear trend toward higher or lower grade across the four age groups for individual cancer types. For men ≥ 85 the frequency of metastatic cancer at dx increased for prostate (5.8% vs 14.6% p=0.01) and decreased for other GI tumors (63.8% vs 43.5% p=0.05). Cancer as a cause of death decreased among the entire cohort from 44.1% in men aged 55-64 to 20.5% in men ≥ 85, and among those with cancer it decreased from 93.6% to 52.8%. In the matched cohort analysis, the HR for death from all cancers combined declined markedly across categories of increasing age at cancer dx from 10.9 (95%CI:6.0-19.9) in men < 55 to 1.9 (95%CI:1.5-2.4) in men ≥ 85. There was a similar decline in the HR with increasing age for cancer death from lymphoma, melanoma, prostate and colorectal cancers, whereas the HR of lung, other GI and urinary tumors remained stable.


In this prospective cohort of apparently healthy U.S. male physicians, characteristics of cancer in men ≥ 85 varied considerably with tumor type and may reflect changes in cancer detection or biology with age. Cancer specific mortality decreased markedly with increasing age of diagnosis for most cancers. This is likely explained by competing risks of death which outpace that of cancer, but may also suggest decreased cancer aggressiveness in advanced age. 


The association of serum magnesium and mortality outcomes in heart failure patients: A systematic review and meta-analysis.

Angkananard T, Anothaisintawee T, Eursiriwan S, Gorelik O, McEvoy M, Attia J, Thakkinstian A.

Medicine (Baltimore). 2016 Dec;95(50):e5406.

PMID: 27977579



Low serum magnesium (Mg) has been independently shown to increase risk of heart failure (HF), but data on the association between serum Mg concentration and the outcome of patients with HF are conflicting. The purpose of this systematic review and meta-analysis was to estimate the prognostic effects of hypermagnesemia and hypomagnesemia on cardiovascular (CV) mortality and all-cause mortality (ACM) of patients with HF.


Relevant studies were identified from the Medline and Scopus databases. Inclusion and exclusion criteria were defined. Effects (i.e., log[risk ratio (RR)]) of hypomagnesemia and hypermagnesemia versus normomagnesemia were estimated using Poisson regression, and a multivariate meta-analysis was then applied to pool RRs across studies. Heterogeneity was explored using meta-regression and subgroup analysis.
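
[The pooling step can be illustrated with the simpler univariate fixed-effect form of inverse-variance meta-analysis; the authors applied a multivariate model, and the study values below are invented.]

```python
# Simplified sketch: univariate fixed-effect inverse-variance pooling of
# log risk ratios. The paper used a multivariate meta-analysis; this only
# illustrates the basic pooling step, and the study values are invented.
import math

def pooled_rr(estimates):
    """estimates: list of (risk ratio, standard error of the log risk ratio)."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    log_rrs = [math.log(r) for r, _ in estimates]
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    return math.exp(pooled_log)

studies = [(1.38, 0.13), (1.35, 0.07), (1.20, 0.20)]
rr = pooled_rr(studies)  # falls between the smallest and largest study RRs
```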


On analysis, 7 eligible prospective studies yielded a total of 5172 chronic HF patients with 913 and 1438 deaths from CV and ACM, respectively. Most participants were elderly men with left ventricular (LV) ejection fraction ≤40%. Those patients with baseline hypermagnesemia had a significantly higher risk of CV mortality (RR, 1.38; 95% confidence interval [CI], 1.07-1.78) or ACM (RR, 1.35; 95% CI, 1.18-1.54) than those with baseline normomagnesemia. However, baseline hypomagnesemia was not associated with the risk of CV mortality (RR, 1.11; 95% CI, 0.79-1.57) and ACM (RR, 1.11; 95% CI, 0.87-1.41). A subgroup analysis by Mg cutoff suggested a dose-response trend for hypermagnesemia effects, that is, the pooled RRs for CV mortality were 1.28 (95% CI, 1.05-1.55) and 1.92 (95% CI, 1.00-3.68) for the cutoff of 0.89 to 1.00 and 1.05 to 1.70 mmol/L, respectively.


The present systematic review and meta-analysis suggested that, in HF patients, hypermagnesemia with serum Mg ≥ 1.05 mmol/L was associated with an increased risk of CV mortality and ACM but this was not observed for hypomagnesemia. This finding was limited to the elderly patients with chronic HF who had reduced LV systolic function.


[it is an abstract only.]

A prospective study of reproductive factors, hormone use, and risk of lung cancer in postmenopausal women.

Baik CS, Strauss GM, Feskanich D.

J Clin Oncol. 2009 May 20;27(15_suppl):1501.

PMID: 27964314



Background: There has been increased interest in understanding the role of hormonal factors in lung cancer (LC) in women with the observation that it exhibits different epidemiologic patterns and treatment response when compared to men. However, results of published studies have been inconsistent, possibly due to inadequate smoking adjustment.


We prospectively examined the associations between reproductive factors, exogenous hormone use, and LC incidence in 106,574 postmenopausal women in the Nurses' Health Study. Participants completed biennial questionnaires which included updated smoking history. We assessed age at menopause, age at menarche, type of menopause, parity, postmenopausal hormone (PMH), and oral contraceptive use. Cox proportional hazards modeling was used to estimate the relative risks (RR) of each exposure, adjusted for smoking status, number of cigarettes, time since quitting, age of initiating smoking, fruit/vegetable intake, body mass index, and environmental smoking exposure.


We identified 1,565 LC cases during follow up from 1984 to 2004. Parity was associated with decreased LC risk in never smokers (RR = 0.54, 95% CI 0.31-0.96) but increased risk in current smokers (RR = 1.44, 95% CI 1.03-2.02). No association was seen in former smokers. Also, younger age at menopause was associated with higher LC risk in women with natural menopause (p-trend = 0.016). PMH use was not associated with LC incidence. The RR for current PMH users was 1.01 (95% CI 0.87-1.17) and for past users was 0.95 (95% CI 0.82-1.1). No significant association was seen when assessed by duration of PMH use, time since last use, or type of PMH. However, past use of oral contraceptives for greater than 5 years was associated with increased LC risk (RR = 1.2, 95% CI 1.03-1.41).


These results suggest that there may be an association between hormonal factors and LC development, and further suggest that the mechanism may differ in smokers versus lifelong never smokers. 


[it is an abstract only.]

Serum calcium and incident and fatal prostate cancer in the Swedish AMORIS study.

Van Hemelrijck M, Hermans R, Michaelsson K, Garmo H, Hammar N, Jungner I, Walldius G, Lambe M, Holmberg L.

J Clin Oncol. 2012 Feb 10;30(5_suppl):36.

PMID: 27967843


Background: Many observational studies have shown a positive association between intake of dairy products and prostate cancer (PCa) risk. From a biological point of view it is of interest to study this association as bone was recently shown to be a positive regulator of male fertility, which suggests that regulation of bone remodeling and reproduction are linked. Since androgens promote cell proliferation and inhibit prostate cell death, it is possible that calcium (Ca) is linked to PCa risk via its link with the reproductive system. We studied the association between serum Ca and PCa while also accounting for levels of albumin, a protein to which Ca is bound.


A cohort based on 192,183 men with baseline information on Ca (mmol/L) and albumin (g/L) was selected from the Swedish Apolipoprotein MOrtality RISk (AMORIS) study. Age-stratified multivariable Cox proportional hazard models were used to analyze associations between Ca and incident and fatal PCa risk. All models were adjusted for fasting status, glucose levels, socio-economic status, season at time of Ca measurement, Charlson comorbidity index, and history of fractures.
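The abstract later reports "albumin-corrected" calcium but does not state the correction formula; a widely used choice is Payne's formula, which adds 0.02 mmol/L for each g/L that albumin falls below 40 g/L. A sketch under that assumption (the example values are illustrative, not study data):

```python
def albumin_corrected_calcium(ca_mmol_l, albumin_g_l):
    """Payne's correction (an assumption here, not stated in the abstract):
    adjust total serum calcium for the fraction bound to albumin."""
    return ca_mmol_l + 0.02 * (40.0 - albumin_g_l)

# A patient with low albumin: measured Ca looks normal, corrected Ca is higher.
corrected = albumin_corrected_calcium(2.20, 30.0)  # -> 2.40 mmol/L
```

At albumin of exactly 40 g/L the correction is zero, so measured and corrected calcium coincide.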


In total, 6,202 men were diagnosed with PCa and 672 died of PCa during mean follow-up of 12 years. A weak negative association was found between PCa risk and Ca (HR per SD: 0.97 (95%CI: 0.95-1.00)). A similar association was also found between albumin-corrected Ca and PCa risk (HR: 0.96 (0.89-1.03), 0.94 (0.87-1.01), and 0.92 (0.86-0.99) for the 2nd, 3rd, and 4th quartile compared to the 1st; P for trend: 0.02). No association was found with fatal PCa, nor was there effect-modification by overweight. A strong positive association between Ca and death was observed when censoring for PCa (HR per SD: 1.13 (95%CI: 1.12-1.15)).


Serum levels of Ca were weakly negatively associated with PCa risk in our study when adjusted for age and history of comorbidities and fractures. A negative association between Ca and PCa risk is likely explained by the strong relation between Ca and non-PCa death. These competing risks need to be handled in order to define whether Ca is causally involved in PCa aetiology or whether it only acts as a marker of other metabolic events in the causal pathway.


[it is an abstract only.]

Rye bread consumption in early life and reduced risk of advanced prostate cancer.

Torfadottir JE, Valdimarsdottir UA, Mucci L, Stampfer MJ, Kasperzyk J, Fall K, Tryggvadottir L, Aspelund T, Olafsson O, Harris TB, Jonsson E, Tulinus H, Adami HO, Gudnason V, Steingrimsdottir L.

J Clin Oncol. 2012 Feb 10;30(5_suppl):79.

PMID: 27967881



Background: Prior evidence suggests that rye consumption may reduce risk of prostate cancer (PCa). Our aim is to study whether consumption of rye bread and oatmeal (sources of lignans and fiber), both in early- and midlife, is associated with risk of PCa.


In 2002 to 2006, 2,268 men, aged 67-96 years, reported their dietary habits in the AGES-Reykjavik cohort study. Dietary habits were assessed retrospectively for early life and midlife using a validated food frequency questionnaire (FFQ). Through linkage to cancer and mortality registers, we retrieved information on PCa diagnosis and mortality through 2009. We used logistic regression to estimate odds ratios (ORs) for PCa according to rye and oatmeal consumption, adjusted for possible confounding factors including fish, fish liver oil, meat, and milk intake.


Of the 2,268 men, 347 had or were diagnosed with PCa during follow-up, 63 with advanced disease (stage 3+ or died of PCa). Daily rye bread consumption in adolescence (vs. less than daily) was associated with a decreased risk of PCa diagnosis (OR 0.78; 95% CI: 0.60, 1.00), and of advanced PCa (OR 0.53; 95% CI: 0.32, 0.93). High intake of oatmeal in adolescence (≥5 vs. ≤4 times/week) was not significantly associated with risk of PCa diagnosis (OR 1.00; 95% CI: 0.78, 1.29) nor advanced PCa (OR 0.73; 95% CI: 0.41, 1.31). Midlife consumption of rye or oatmeal was not associated with PCa risk.


Our data suggest that frequent rye bread consumption in adolescence may be associated with reduced risk of PCa, particularly advanced disease.


[it is an abstract only.]

Diet, supplements, and lifestyle factors and risk of progression in contemporary active surveillance patients.

Trock BJ, Feng Z, Landis P, Carter B.

J Clin Oncol. 2012 Feb 10;30(5_suppl):138.

PMID: 27968160


Background: Men diagnosed with prostate cancer frequently change their diet or lifestyle in an effort to decrease their risk of developing aggressive disease. Men managed with active surveillance (AS) may feel a greater need to modify risk factors because their cancer is untreated. However, no data on post-diagnosis diet and lifestyle factors from large prospective cohorts with long-term follow-up have been reported. We evaluated the influence of diet and lifestyle on the risk of prostate cancer biopsy progression in the largest prospective AS cohort in the US.


A diet and lifestyle questionnaire was completed by AS participants at enrollment. Progression was determined by occurrence of Gleason score >7 or increase in tumor volume at annual surveillance biopsy. Analysis focused on 28 nutrients, 10 food groups, 9 supplements, 7 medication variables, and 2 lifestyle variables with a priori hypotheses. Data were analyzed by Wilcoxon test and logistic regression, with calorie adjustment by the residual method.
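The "residual method" of calorie adjustment mentioned here (Willett's approach) regresses each nutrient on total energy intake and uses the residuals, re-centred at the mean intake, as the energy-adjusted exposure. A minimal pure-Python sketch (variable names and data are mine, for illustration):

```python
def residual_adjust(nutrient, calories):
    """Willett residual method: energy-adjusted nutrient intake =
    residual from OLS of nutrient on calories, plus mean nutrient intake."""
    n = len(nutrient)
    mx = sum(calories) / n
    my = sum(nutrient) / n
    # Ordinary least-squares slope and intercept of nutrient on calories
    sxy = sum((x - mx) * (y - my) for x, y in zip(calories, nutrient))
    sxx = sum((x - mx) ** 2 for x in calories)
    slope = sxy / sxx
    intercept = my - slope * mx
    return [y - (intercept + slope * x) + my
            for x, y in zip(calories, nutrient)]

# If intake scales exactly with calories, adjusted values collapse to the mean:
adjusted = residual_adjust([50.0, 60.0, 70.0], [2000.0, 2500.0, 3000.0])
```

The point of the method is that any variation in nutrient intake explained purely by eating more overall is removed before looking for associations.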


There were 723 men in the analysis, of whom 187 (26%) progressed. Median follow-up was 2.7 years. Men who progressed were significantly older, had higher PSA density, and higher biopsy tumor burden at diagnosis. In multivariable analyses, men who progressed had higher PSA density (OR=1.34 per 0.1 ng/ml/cc, p=0.017), were more likely to have >1 positive biopsy core (OR=2.9, p<0.0001), had higher percentage core involved with tumor (OR=1.04, p<0.0001), and were more likely to be current smokers (OR=4.1, p=0.004); duration of NSAID use and history of prostatitis approached significance.


Cigarette smoking was significantly associated with risk of progression. This association must be viewed with caution due to the large number of variables tested. It is likely that many men considered to have progressed represent undersampling of the initial biopsy, rather than true biologic progression; this misclassification may obscure risk relationships. Ongoing analyses will be presented that explore whether associations differ for early vs late progression events. We believe this is the first report that smoking is associated with progression in men managed by AS.


[it is an abstract only.]

Vitamin E and the risk of prostate cancer: Updated results of the Selenium and Vitamin E Cancer Prevention Trial (SELECT).

Klein EA, Thompson I, Tangen CM, Lucia MS, Goodman P, Minasian LM, Ford LG, Parnes HL, Gaziano JM, Karp DD, Lieber MM, Walther PJ, Parsons JK, Chin J, Darke AK, Lippman SM, Goodman GE, Meyskens FL, Baker LH.

J Clin Oncol. 2012 Feb 10;30(5_suppl):7.

PMID: 27968278


Background: The initial report of the Selenium and Vitamin E Cancer Prevention Trial (SELECT) found no reduction in risk of prostate cancer with either selenium or vitamin E supplements but a non-statistically significant increase in prostate cancer risk with vitamin E. Longer follow-up and more prostate cancer events provide further insight into the relationship of vitamin E and prostate cancer.


SELECT randomized 35,533 men from 427 study sites in the United States, Canada and Puerto Rico in a double-blind manner between August 22, 2001 and June 24, 2004. Eligible men were 50 years or older (African Americans) or 55 years or older (all others) with a PSA <4.0 ng/mL and a digital rectal examination not suspicious for prostate cancer. Included in the analysis are 34,887 men randomly assigned to one of four treatment groups: selenium (n=8752), vitamin E (n=8737), both agents (n=8702), or placebo (n=8696). Data reflect the final data collected by the study sites on their participants through July 5, 2011.


This report includes 54,464 additional person-years of follow-up since the primary report. Hazard ratios (99% confidence intervals [CI]) and numbers of prostate cancers were 1.17 (99% CI 1.004-1.36, p=.008, n=620) for vitamin E, 1.09 (99% CI 0.93-1.27, p=.18, n=575) for selenium, 1.05 (99%CI 0.89-1.22, p=.46, n=555) for selenium + vitamin E vs. 1.00 (n=529) for placebo. The absolute increase in risk compared with placebo for vitamin E, selenium and the combination were 1.6, 0.9 and 0.4 cases of prostate cancer per 1,000 person-years.
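The absolute increases quoted here are incidence-rate differences. The abstract does not give per-arm person-years, so the figures below use an assumed (hypothetical) person-year denominator purely to show the arithmetic:

```python
def rate_diff_per_1000(cases_a, py_a, cases_b, py_b):
    """Incidence-rate difference between two arms, per 1,000 person-years."""
    return (cases_a / py_a - cases_b / py_b) * 1000.0

# Illustrative only: 620 vitamin E cases vs 529 placebo cases over an assumed
# 56,875 person-years per arm (the assumed denominator is NOT trial data).
excess = rate_diff_per_1000(620, 56875.0, 529, 56875.0)
```

With that assumed denominator the excess works out to 1.6 cases per 1,000 person-years, the same order as the figure reported for vitamin E.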


Dietary supplementation with Vitamin E significantly increases the risk of prostate cancer among healthy men.


[it is an abstract only.]

Lipid profiles and the risk of kidney cancer in the Swedish AMORIS study.

Van Hemelrijck M, Garmo H, Hammar N, Walldius G, Lambe M, Jungner I, Holmberg L.

J Clin Oncol. 2011 Mar;29(7_suppl):342.

PMID: 27968678


Background: Since multiple epidemiologic studies showed a link between obesity and kidney cancer (KCa), lipid metabolism is thought to play a role in the development of KCa. With the exception of cholesterol and total fat intake, the association between changes in lipid biomarkers and KCa has rarely been researched. We assessed the link between lipid profiles and KCa risk in a large prospective cohort study.


A cohort based on 85,261 persons (> 20 years old) with baseline measurements of glucose, triglycerides (TG), total cholesterol, HDL, LDL, apolipoprotein A-I and apoB was selected from the Swedish Apolipoprotein Mortality Risk (AMORIS) study. Multivariate Cox proportional hazards models were used to analyze associations between quartiles and dichotomized values of these lipid components and KCa risk. All models were adjusted for age, gender, socioeconomic status, fasting status, history of kidney disease prior to baseline (ICD9: 580-93), and glucose, cholesterol, and TG levels (depending on the covariate of interest).


During a mean follow-up of 12 years, 161 persons developed KCa (58% men). The mean age at baseline was 46 years. TG were the only lipid component for which a statistically significant association was found with risk of KCa (Hazard Ratio (HR): 1.05 (95%CI: 0.59-1.87), 1.77 (1.05-2.98), and 1.77 (1.04-3.02) for the second, third, and fourth quartile, compared to the first, with p-value for trend: 0.008). The lipid ratio of TG and HDL also showed a statistically significant positive association with risk of KCa (HR: 1.21 (0.71-2.08), 1.56 (0.94-2.58), and 1.92 (1.17-3.17) for the second, third, and fourth quartile, compared to the first, with p-value for trend: 0.004). No other associations were found between lipid components and KCa risk.


This detailed analysis of lipid components and risk of KCa found a relation between levels of TG and KCa risk. In contrast to previous studies, we did not find an association between cholesterol levels and KCa risk. Lipid profiles based on the markers used in this study do not seem to reflect the etiological pathway that has previously been shown between obesity and KCa. Further mechanistic studies are required to assess the link between lipid deregulation and KCa. 


[The below paper is not pdf-availed.]

Fish consumption and omega-3 fatty acid supplementation for prevention or treatment of cognitive decline, dementia or Alzheimer's disease in older adults - any news?

Cederholm T.

Curr Opin Clin Nutr Metab Care. 2016 Dec 12. [Epub ahead of print]

PMID: 27977429



Twenty years of research indicates that fish and n-3 fatty acids (FAs), for example docosahexaenoic acid, may attenuate cognitive decline including Alzheimer's disease in older people. This review concerns reports during 2015-2016 in humans.


One prospective cohort study showed that seafood consumption was related to fewer neuritic plaques and neurofibrillary tangles in brain autopsies from elderly care residents. In a large 5-year intervention trial, no effect on cognition could be shown in either the n-3 FA-supplemented or the control patients. Two meta-analyses in community-dwelling patients support preservation of cognition with higher fish intake. Older adults with memory complaints may improve cortical blood flow during memory challenges by n-3 FA supplementation. Recalculations from a report in Alzheimer's disease patients indicated a dose-response pattern between increments of serum n-3 FAs and cognitive improvement. Still, a Cochrane review (based on three randomized controlled trials) concluded that n-3 FAs cannot provide any 6-month benefit in patients with mild/moderate Alzheimer's disease.


The accumulated knowledge indicates that healthy populations may derive preventive benefit from fish and docosahexaenoic acid intake, as may older adults with memory complaints/mild cognitive impairment and perhaps subgroups of patients with mild/moderate Alzheimer's disease. Still, more studies are needed.


Deep Breaths for Memory

Posted on Dec. 15, 2016, 6 a.m. in Brain and Mental Performance Respiratory

Rhythm of breathing impacts electrical activity in the brain that improves emotional judgment and memory recall.


Scientists at Northwestern Medicine have discovered that the rhythm of breathing creates electrical activity in the human brain that enhances memory recall and emotional judgments.

Christina Zelano, the lead author of this study and an assistant professor of neurology at the Feinberg School of Medicine at Northwestern University, stated that there is a dramatic difference in activity in the brain during inhalation as compared with exhalation. Breathing in stimulates neurons in the olfactory cortex, hippocampus, and amygdala and all across the limbic system.

The olfactory system, or sense of smell, detects substances that are airborne, while an accessory system senses fluid-phase stimuli.

The hippocampus is part of the limbic system that directs many bodily functions. Located near the center of the brain in the brain's medial temporal lobe, it is involved in the storage of long-term memory including past knowledge, events, facts, and experiences. It is not involved with short-term memory and procedural memory such as motor actions like walking.    

The amygdala is a cell mass located within the brain’s temporal lobes. There are two amygdalae with one in each hemisphere of the brain. The amygdala is a structure of the limbic system that is involved in emotional processing and motivations, particularly those that have to do with survival and the processing of emotions like anger, pleasure, and especially fear.  

The scientists studied seven epilepsy patients who were scheduled for brain surgery. A week before surgery, electrodes were implanted into the patients' brains to identify the origin of their seizures. This allowed the scientists to get electro-physiological data directly from the brains of the patients. The electrical signals indicated that brain activity fluctuated with breathing and occurred in areas of the brain where memory, emotions, and smells are processed. That discovery led scientists to wonder whether cognitive functions associated with these brain areas, particularly fear processing and memory, could also be affected by breathing.

Behavior is affected by whether a person inhales or exhales and whether breathing is done through the nose or through the mouth. Scientists had approximately 60 subjects make rapid decisions on presented emotional expressions while their breathing was recorded. Study individuals identified a fearful face more quickly when breathing in rather than breathing out. They were more likely to remember an object if they encountered it on the inhaled breath than on the exhaled one. The effect was specific to fearful stimuli, occurred only during nasal breathing, and disappeared if breathing was done through the mouth.

The findings also imply that rapid breathing may be an advantage when in a dangerous situation. In a panic state, breathing rhythm becomes faster. Proportionally more time is spent inhaling than when a person is in a calm state. The body's response to fear with faster breathing seems to have a positive impact on brain function and results in a faster response time to dangerous stimuli in the environment.


Nasal Respiration Entrains Human Limbic Oscillations and Modulates Cognitive Function.

Zelano C, Jiang H, Zhou G, Arora N, Schuele S, Rosenow J, Gottfried JA.

J Neurosci. 2016 Dec 7;36(49):12448-12467.

PMID: 27927961


The need to breathe links the mammalian olfactory system inextricably to the respiratory rhythms that draw air through the nose. In rodents and other small animals, slow oscillations of local field potential activity are driven at the rate of breathing (∼2-12 Hz) in olfactory bulb and cortex, and faster oscillatory bursts are coupled to specific phases of the respiratory cycle. These dynamic rhythms are thought to regulate cortical excitability and coordinate network interactions, helping to shape olfactory coding, memory, and behavior. However, while respiratory oscillations are a ubiquitous hallmark of olfactory system function in animals, direct evidence for such patterns is lacking in humans. In this study, we acquired intracranial EEG data from rare patients (Ps) with medically refractory epilepsy, enabling us to test the hypothesis that cortical oscillatory activity would be entrained to the human respiratory cycle, albeit at the much slower rhythm of ∼0.16-0.33 Hz. Our results reveal that natural breathing synchronizes electrical activity in human piriform (olfactory) cortex, as well as in limbic-related brain areas, including amygdala and hippocampus. Notably, oscillatory power peaked during inspiration and dissipated when breathing was diverted from nose to mouth. Parallel behavioral experiments showed that breathing phase enhances fear discrimination and memory retrieval. Our findings provide a unique framework for understanding the pivotal role of nasal breathing in coordinating neuronal oscillations to support stimulus processing and behavior.


Animal studies have long shown that olfactory oscillatory activity emerges in line with the natural rhythm of breathing, even in the absence of an odor stimulus. Whether the breathing cycle induces cortical oscillations in the human brain is poorly understood. In this study, we collected intracranial EEG data from rare patients with medically intractable epilepsy, and found evidence for respiratory entrainment of local field potential activity in human piriform cortex, amygdala, and hippocampus. These effects diminished when breathing was diverted to the mouth, highlighting the importance of nasal airflow for generating respiratory oscillations. Finally, behavioral data in healthy subjects suggest that breathing phase systematically influences cognitive tasks related to amygdala and hippocampal functions.


amygdala; hippocampus; local field potential; piriform cortex; respiration; respiratory oscillations


Turning White Fat to Brown - The Key to Obesity Treatment?

Posted on Dec. 16, 2016, 6 a.m. in Weight and Obesity Cellular Reprogramming

Scientists from the University of Pennsylvania believe that they have found the secret for turning "bad" white fat into "good" brown fat.

Senior author of the study, Dr. Zoltan P. Arany, related that he and his colleagues deleted a gene in the white fat cells of mice. The gene encodes folliculin (FLCN), a tumor-suppressor protein. Once the gene was deleted, the protein TFE3 was able to enter the cells' nucleus and bind to DNA, activating a protein known as PGC-1β, which plays a major role in regulating cell metabolism.

Usually, that process does not occur: TFE3 cannot enter the cell nucleus because two other proteins, FLCN and mTOR, work to keep it out and keep the browning process switched off. When FLCN was deleted in the mice, the white cells became browner and began producing more mitochondria, the oxygen-consuming organelles that provide chemical energy inside cells. In brown fat cells, mitochondria convert energy into heat.

When the gene was deleted, the white cells came to resemble the preferred brown cells: a set of genes was switched on that changed the cells' structure, altered patterns of gene expression, and boosted the mitochondria's ability to consume oxygen.

The human body has different types of fat that fulfill different purposes. When white fat cells, known as white adipocytes, fill with fat molecules, obesity can result. Brown fat cells, known as brown adipocytes, form the "baby fat" in infants, who have much more brown fat than adults. Brown fat transfers the energy from food into heat, a process known as thermogenesis. The heat protects the body from cold, and the fat-burning process helps prevent obesity and related disorders such as diabetes, heart disease, and cancer.

Dr. Arany states that there is still a long way to go and more research is needed, but the scientists are hopeful that this discovery will eventually lead the way to a new drug treatment that will prevent diabetes and reduce obesity by pushing white fat to become brown fat.


The tumor suppressor FLCN mediates an alternate mTOR pathway to regulate browning of adipose tissue.

Wada S, Neinast M, Jang C, Ibrahim YH, Lee G, Babu A, Li J, Hoshino A, Rowe GC, Rhee J, Martina JA, Puertollano R, Blenis J, Morley M, Baur JA, Seale P, Arany Z.

Genes Dev. 2016 Nov 15;30(22):2551-2564.

PMID: 27913603


Noncanonical mechanistic target of rapamycin (mTOR) pathways remain poorly understood. Mutations in the tumor suppressor folliculin (FLCN) cause Birt-Hogg-Dubé syndrome, a hamartomatous disease marked by mitochondria-rich kidney tumors. FLCN functionally interacts with mTOR and is expressed in most tissues, but its role in fat has not been explored. We show here that FLCN regulates adipose tissue browning via mTOR and the transcription factor TFE3. Adipose-specific deletion of FLCN relieves mTOR-dependent cytoplasmic retention of TFE3, leading to direct induction of the PGC-1 transcriptional coactivators, drivers of mitochondrial biogenesis and the browning program. Cytoplasmic retention of TFE3 by mTOR is sensitive to ambient amino acids, is independent of growth factor and tuberous sclerosis complex (TSC) signaling, is driven by RagC/D, and is separable from canonical mTOR signaling to S6K. Codeletion of TFE3 in adipose-specific FLCN knockout animals rescues adipose tissue browning, as does codeletion of PGC-1β. Conversely, inducible expression of PGC-1β in white adipose tissue is sufficient to induce beige fat gene expression in vivo. These data thus unveil a novel FLCN-mTOR-TFE3-PGC-1β pathway-separate from the canonical TSC-mTOR-S6K pathway-that regulates browning of adipose tissue.


FLCN; TFE3; adipose tissue; beige fat; mTOR; mitochondria


Chocolate intake and incidence of heart failure: Findings from the Cohort of Swedish Men.

Steinhaus DA, Mostofsky E, Levitan EB, Dorans KS, Håkansson N, Wolk A, Mittleman MA.

Am Heart J. 2017 Jan;183:18-23. doi: 10.1016/j.ahj.2016.10.002.

PMID: 27979037



The objective of this study was to evaluate the association of chocolate consumption and heart failure (HF) in a large population of Swedish men.


We conducted a prospective cohort study of 31,917 men 45-79 years old with no history of myocardial infarction, diabetes, or HF at baseline who were participants in the population-based Cohort of Swedish Men study. Chocolate consumption was assessed through a self-administered food frequency questionnaire. Participants were followed for HF hospitalization or mortality from January 1, 1998, to December 31, 2011, using record linkage to the Swedish inpatient and cause-of-death registries.


During 14 years of follow-up, 2,157 men were hospitalized (n=1,901) or died from incident HF (n=256). Compared with subjects who reported no chocolate intake, the multivariable-adjusted rate ratio of HF was 0.88 (95% CI 0.78-0.99) for those consuming 1-3 servings per month, 0.83 (95% CI 0.72-0.94) for those consuming 1-2 servings per week, 0.82 (95% CI 0.68-0.99) for those consuming 3-6 servings per week, and 1.10 (95% CI 0.84-1.45) for those consuming ≥1 serving per day (P for quadratic trend=.001).
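The quadratic trend behind the reported J-shape can be illustrated by fitting log rate ratios against ordinal category scores. Below, the scores 0-4 are my own coding of the five consumption categories (the study's actual trend variable is not given); a positive quadratic coefficient reproduces the dip-then-rise pattern:

```python
import math

def quadratic_fit(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the 3x3 normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]               # moment sums
    T = [sum((x ** k) * y for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    # Gaussian elimination with partial pivoting on the augmented matrix
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [A[r][k] - f * A[i][k] for k in range(4)]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                           # back-substitute
        coef[i] = (A[i][3] - sum(A[i][k] * coef[k]
                                 for k in range(i + 1, 3))) / A[i][i]
    return coef  # (a, b, c)

# Reported rate ratios by consumption category (none ... >=1 serving/day):
log_rr = [math.log(r) for r in (1.00, 0.88, 0.83, 0.82, 1.10)]
a, b, c = quadratic_fit([0, 1, 2, 3, 4], log_rr)
# c > 0 gives upward curvature (the J-shape); the nadir sits at x = -b / (2c)
```

The fitted curvature is positive and the nadir lands in the middle categories, consistent with the abstract's conclusion that moderate, but not daily, consumption was associated with lower risk.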


In this large prospective cohort study, there was a J-shaped relationship between chocolate consumption and HF incidence. Moderate chocolate consumption was associated with a lower rate of HF hospitalization or death, but the protective association was not observed among individuals consuming ≥1 serving per day.

Journal Subject Codes: Etiology: Epidemiology, Heart failure: Congestive.


Dietary tofu intake and long-term risk of death from stroke in a general population.

Nguyen HN, Miyagawa N, Miura K, Okuda N, Yoshita K, Arai Y, Nakagawa H, Sakata K, Ojima T, Kadota A, Takashima N, Fujiyoshi A, Ohkubo T, Abbott RD, Okamura T, Okayama A, Ueshima H; NIPPON DATA80 Research Group.

Clin Nutr. 2016 Dec 3. pii: S0261-5614(16)31337-1. doi: 10.1016/j.clnu.2016.11.021. [Epub ahead of print]

PMID: 27979412




Although dietary soy intake is linked with health benefits, a relation with stroke has not been established. The present study examined the association between the intake of tofu, the richest source of dietary soy, and stroke mortality in a general population cohort of Japanese men and women.


Data comprise 9244 Japanese enrolled in the National Nutrition Survey of Japan in 1980. Participants were free of cardiovascular disease and followed for 24 years. Dietary intake was estimated from 3-day weighed food records. Multivariable Cox regression models were used to estimate hazard ratios across levels of tofu intake.


During follow-up, there were 417 deaths due to stroke (88 cerebral hemorrhage [CH], 245 cerebral infarction [CI], and 84 of other subtypes). Among all men, and in women aged 65 years or more, tofu intake was unrelated to each form of stroke. For young women (<65 years of age), a significantly lower risk of CH in the top versus bottom quartile of tofu intake was observed (Multivariable-adjusted HR = 0.26, 95% CI: 0.08-0.85).


In this large prospective study with long follow-up of Japanese men and women, consumption of tofu was unrelated to the risk of stroke except for CH in women <65 years of age. Whether the association in younger women is real or due to chance alone warrants further study.


Cerebral hemorrhage; Cerebral infarction; Cohort study; Soy foods; Stroke mortality; Tofu


Pancreatic cancer risk in relation to sex, lifestyle factors, and pre-diagnostic anthropometry in the Malmö Diet and Cancer Study.

Andersson G, Wennersten C, Borgquist S, Jirström K.

Biol Sex Differ. 2016 Dec 9;7:66.

PMID: 27980714




Lifestyle factors may influence the risk of developing pancreatic cancer. Whereas cigarette smoking is an established risk factor, the effects of high alcohol intake and obesity are more uncertain. The aim of the present study was to examine the associations of pre-diagnostic anthropometry, alcohol consumption, and smoking habits with pancreatic cancer risk in a Swedish prospective, population-based cohort, with particular reference to potential sex differences.


The studied cohort consists of 28,098 participants, including all incident cases of pancreatic cancer, in the Malmö Diet and Cancer Study up until December 31, 2013 (n = 163). Non-parametric and chi-squared tests were applied to compare the distribution of risk factors between cases and non-cases. Cox regression proportional hazards models were used to estimate the relationship between investigative factors and pancreatic cancer risk. Anthropometric factors included height, weight, body mass index (BMI), waist and hip circumference, waist-hip ratio (WHR), and body fat percentage.


BMI was not a significant risk factor for pancreatic cancer, but a higher WHR was significantly associated with an increased risk in the entire cohort (hazard ratio (HR) 2.36, 95% confidence interval (CI) 1.28-4.35, p for trend = 0.009). Regular smoking was a significant risk factor among both women (HR 2.62, 95% CI 1.61-4.27) and men (HR 3.57, 95% CI 1.70-7.47), whereas occasional smoking was a significant risk factor only in women (HR 3.29, 95% CI 1.50-7.19). Passive smoking at work for >20 years was significantly associated with an increased risk in the entire cohort (HR 1.73, 95% CI 1.15-2.58) and in women selectively (HR 2.01, 95% CI 1.21-3.31). Alcohol consumption was not a significant risk factor. A significant interaction was found between female sex and age (p = 0.045), but no other factor, in relation to pancreatic cancer risk.


WHR was the only pre-diagnostic anthropometric factor associated with pancreatic cancer risk, with no sex-related differences. Regular smoking was confirmed as a significant risk factor in both sexes, whereas occasional and passive smoking were significant risk factors only in women. Despite the lack of a significant interaction between smoking and sex in relation to pancreatic cancer risk, potential sex differences should be considered in future epidemiological studies.


Alcohol; Lifestyle; Obesity; Pancreatic cancer risk; Smoking


Maternal vitamin D status during pregnancy and risk of childhood asthma -- a meta-analysis of prospective studies.

Song H, Yang L, Jia C.

Mol Nutr Food Res. 2016 Dec 16. doi: 10.1002/mnfr.201600657. [Epub ahead of print]

PMID: 27981740






Mounting evidence suggests that maternal vitamin D status during pregnancy may be associated with development of childhood asthma, but the results are still inconsistent. A dose-response meta-analysis was performed to quantitatively summarize evidence on the association of maternal vitamin D status during pregnancy with the risk of childhood asthma.


A systematic search was conducted to identify all studies assessing the association of maternal 25-hydroxyvitamin D (25(OH)D) during pregnancy with risk of childhood asthma. A fixed- or random-effects model was selected based on the heterogeneity test among studies. The nonlinear dose-response relationship was assessed by a restricted cubic spline model. Fifteen prospective studies with 12,758 participants and 1,795 cases were included in the meta-analysis. The pooled RR of childhood asthma comparing the highest versus lowest category of maternal 25(OH)D levels was 0.87 (95% confidence interval, CI, 0.75-1.02). For the dose-response analysis, evidence of a U-shaped relationship was found between maternal 25(OH)D levels and risk of childhood asthma (P-nonlinearity = 0.02), with the lowest risk at approximately 70 nmol/L of 25(OH)D.
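Restricted cubic splines, as in Harrell's formulation commonly used for such dose-response models, constrain the fitted curve to be linear beyond the outermost knots. A sketch of one basis term (the knot placement below is illustrative, not taken from the meta-analysis):

```python
def rcs_term(x, knots, j):
    """j-th restricted cubic spline basis term (0-indexed, j <= len(knots)-3)
    in Harrell's formulation: cubic between knots, linear beyond them."""
    t = knots
    k = len(t)
    p = lambda u: max(u, 0.0) ** 3          # truncated cubic (u)_+^3
    d = t[k - 1] - t[k - 2]                 # spacing of the last two knots
    return (p(x - t[j])
            - p(x - t[k - 2]) * (t[k - 1] - t[j]) / d
            + p(x - t[k - 1]) * (t[k - 2] - t[j]) / d)

# Illustrative 25(OH)D knots in nmol/L; beyond the last knot the term is
# linear, so second differences vanish.
knots = [30.0, 50.0, 70.0, 90.0]
vals = [rcs_term(x, knots, 0) for x in (120.0, 121.0, 122.0)]
```

The cubic and quadratic pieces cancel by construction past the last knot, which is what lets the model bend in the observed exposure range (producing shapes like the U found here) without wild extrapolation at the extremes.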


This dose-response meta-analysis suggested a U-shaped relationship between maternal blood 25(OH)D levels and risk of childhood asthma. Further studies are needed to confirm the association.


Childhood asthma; Maternal; Meta-analysis; Vitamin D


Abdominal Obesity and Lung Cancer Risk: Systematic Review and Meta-Analysis of Prospective Studies.

Hidayat K, Du X, Chen G, Shi M, Shi B.

Nutrients. 2016 Dec 15;8(12). pii: E810.

PMID: 27983672



Several meta-analyses of observational studies have been performed to examine the association between general obesity, as measured by body mass index (BMI), and lung cancer. These meta-analyses suggest an inverse relation between high BMI and this cancer. In contrast to general obesity, abdominal obesity appears to play a role in the development of lung cancer. However, the association between abdominal obesity (as measured by waist circumference (WC) (BMI adjusted) and waist to hip ratio (WHR)) and lung cancer is not fully understood due to sparse available evidence regarding this association. PubMed and Web of Science databases were searched for studies assessing the association between abdominal obesity and lung cancer up to October 2016. The summary relative risks (RRs) with 95% confidence intervals (CIs) were calculated with a random-effects model. Six prospective cohort studies with 5827 lung cancer cases among 831,535 participants were included in our meta-analysis. Each 10 cm increase in WC and 0.1 unit increase in WHR were associated with 10% (RR 1.10; 95% CI 1.04, 1.17; I² = 27.7%, p-heterogeneity = 0.198) and 5% (RR 1.05; 95% CI 1.00, 1.11; I² = 25.2%, p-heterogeneity = 0.211) greater risks of lung cancer, respectively. According to smoking status, greater WHR was only positively associated with lung cancer among former smokers (RR 1.11; 95% CI 1.00, 1.23). In contrast, greater WC was associated with increased lung cancer risk among never smokers (RR 1.11; 95% CI 1.00, 1.23), former smokers (RR 1.12; 95% CI 1.03, 1.22) and current smokers (RR 1.16; 95% CI 1.08, 1.25). The summary RRs for highest versus lowest categories of WC and WHR were 1.32 (95% CI 1.13, 1.54; I² = 18.2%, p-heterogeneity = 0.281) and 1.10 (95% CI 1.00, 1.23; I² = 24.2%, p-heterogeneity = 0.211), respectively. In summary, abdominal obesity may play an important role in the development of lung cancer.
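The random-effects pooling used in meta-analyses like this one is typically DerSimonian-Laird: each study's log relative risk is weighted by the inverse of its within-study variance plus an estimated between-study variance tau², and I² summarizes the share of variation due to heterogeneity. A compact sketch (the two-study data below are synthetic, not from this meta-analysis):

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool log relative risks with the DerSimonian-Laird random-effects model.
    Returns (pooled log-RR, tau^2, I^2)."""
    w = [1.0 / s ** 2 for s in se]                        # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0       # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in se]              # random-effects weights
    pooled = sum(wi * y for wi, y in zip(wr, log_rr)) / sum(wr)
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0         # heterogeneity share
    return pooled, tau2, i2

# Two identical synthetic studies: no heterogeneity, pooled RR = the common RR.
pooled, tau2, i2 = dersimonian_laird([math.log(1.10), math.log(1.10)],
                                     [0.05, 0.05])
```

When the studies disagree more than their standard errors allow, tau² grows and the pooled estimate moves toward an unweighted average, which is why the abstract reports I² alongside each pooled RR.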


abdominal obesity; central obesity; dose-response; lung cancer; waist circumference; waist to hip ratio
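The per-increment RRs above scale multiplicatively: under the log-linear dose-response model typically used in such meta-analyses, the RR for an arbitrary increment is the reported RR raised to the ratio of the increments. A minimal sketch (the function name is mine, not from the paper):

```python
def scale_rr(rr_per_unit: float, unit: float, increment: float) -> float:
    """Rescale a log-linear relative risk to a different exposure increment."""
    return rr_per_unit ** (increment / unit)

# The paper's RR of 1.10 per 10 cm of waist circumference implies, under the same model:
rr_5cm = scale_rr(1.10, 10, 5)    # ~1.049 per 5 cm
rr_20cm = scale_rr(1.10, 10, 20)  # ~1.21 per 20 cm
```

This is only the point-estimate arithmetic; confidence intervals would need the standard error on the log scale.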


[The below paper is pdf-availed.]

Gamma-glutamyltransferase, fatty liver index and hepatic insulin resistance are associated with incident hypertension in two longitudinal studies.

Bonnet F, Gastaldelli A, Pihan-Le Bars F, Natali A, Roussel R, Petrie J, Tichet J, Marre M, Fromenty B, Balkau B; D.E.S.I.R., RISC Study Groups.

J Hypertens. 2016 Dec 14. [Epub ahead of print]

PMID: 27984413



We hypothesized that liver markers and the fatty liver index (FLI) are predictive of incident hypertension and that hepatic insulin resistance plays a role.


The association between liver markers and incident hypertension was analysed in two longitudinal studies of normotensive individuals: 2565 from the 9-year Data from an Epidemiological Study on the Insulin Resistance syndrome (D.E.S.I.R.) cohort and 321 from the 3-year 'Relationship between Insulin Sensitivity and Cardiovascular disease' (RISC) cohort, who had a measure of endogenous glucose production. The FLI is calculated from BMI, waist circumference, triglycerides and gamma-glutamyltransferase (GGT), and the hepatic insulin resistance index from endogenous glucose production and fasting insulin.
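The abstract names the FLI's inputs but not its form. For readers wanting the arithmetic, here is a sketch of the widely used Bedogni et al. (2006) fatty liver index; I am assuming this paper used that standard formulation, and the illustrative patient values below are hypothetical, not taken from the study.

```python
import math

def fatty_liver_index(tg_mg_dl: float, bmi: float, ggt_u_l: float, waist_cm: float) -> float:
    """Fatty liver index (FLI), 0-100, per the Bedogni et al. 2006 formulation
    (assumed here; the abstract does not print the equation)."""
    y = (0.953 * math.log(tg_mg_dl)   # triglycerides, mg/dL
         + 0.139 * bmi                # body mass index, kg/m^2
         + 0.718 * math.log(ggt_u_l)  # gamma-glutamyltransferase, U/L
         + 0.053 * waist_cm           # waist circumference, cm
         - 15.745)
    return 100 * math.exp(y) / (1 + math.exp(y))

# Hypothetical patient: TG 150 mg/dL, BMI 28 kg/m^2, GGT 40 U/L, waist 95 cm
fli = fatty_liver_index(150, 28, 40, 95)  # ~64.7
```

With these hypothetical values the index lands above the FLI ≥ 60 baseline threshold that the study found predictive of incident hypertension.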


The incidence of hypertension increased across quartile groups of both baseline GGT and alanine aminotransferase. After adjustment for sex, age, waist circumference, fasting glucose, smoking and alcohol intake, only GGT remained significantly associated with incident hypertension [standardized odds ratio: 1.21; 95% confidence interval (1.10-1.34); P = 0.0001]. The change in GGT levels over the follow-up was also associated with an increased risk of hypertension, independently of changes in body weight. Both FLI analysed as a continuous variable and baseline FLI of at least 60 were predictive of incident hypertension in the multivariable model. In the RISC cohort, the hepatic insulin resistance index was positively associated with the risk of 3-year incident hypertension [standardized odds ratio: 1.54 (1.07-2.22); P = 0.02].


Baseline GGT and FLI, as well as an increase in GGT over time, were associated with the risk of incident hypertension. Enhanced hepatic insulin resistance predicted the onset of hypertension and may be a link between liver markers and hypertension.


Caffeine intake is associated with pupil dilation and enhanced accommodation.

Abokyi S, Owusu-Mensah J, Osei KA.

Eye (Lond). 2016 Dec 16. doi: 10.1038/eye.2016.288. [Epub ahead of print]

PMID: 27983733




It is purported that caffeine, an autonomic stimulant, affects visual performance. This study sought to assess whether caffeine intake was associated with changes in pupil size and/or amplitude of accommodation.


A double-masked, crossover study was conducted in 50 healthy subjects aged 19 to 25 years. Subjects were randomized to treatments such that they consumed either a 250 mg caffeine drink or vehicle on separate days. Amplitude of accommodation was measured by the push-up technique, and pupil size using a millimeter ruler fixed to a slit lamp biomicroscope in dim illumination (5 lux). Both measures were taken at baseline and at 30, 60 and 90 min post treatment. Repeated-measures one-way ANOVA and paired t-tests were used to analyze the data.


Amplitude of accommodation and pupil size after caffeine intake were significantly greater than with vehicle (P < 0.001) at each time point. Consumption of the caffeine beverage was associated with significant increases in amplitude of accommodation and pupil size over time (P < 0.001): amplitude of accommodation rose from 12.4 (±2.2) D at baseline to 15.8 (±2.6) D at 90 min, and pupil size increased from 3.4 (±0.4) mm at baseline to 4.5 (±0.72) mm at 90 min. Consumption of vehicle was not associated with an increase in either measure over time.


Pupil size and accommodation are affected after ingestion of caffeine. This study suggests caffeine may have some influence on visual functions.


Predictors of Mortality in People with Recent-onset Gout: A Prospective Observational Study.

Vincent ZL, Gamble G, House M, Knight J, Horne A, Taylor WJ, Dalbeth N.

J Rheumatol. 2016 Dec 15. pii: jrheum.160596. [Epub ahead of print]

PMID: 27980010




To determine mortality rates and predictors of death at baseline in people with a recent onset of gout.


People with gout disease duration < 10 years were recruited from primary and secondary care settings. Comprehensive clinical assessment was completed at baseline. Participants were prospectively followed for at least 1 year. Information about death was systematically collected from primary and secondary health records. Standardized mortality ratios (SMR) were calculated and risk factors for mortality were analyzed using Cox proportional hazard regression models.


The mean (SD) follow-up duration was 5.1 (1.6) years (a total of 1511 patient-years accrued). Of the 295 participants, 43 (14.6%) had died at the time of censoring (SMR 1.96, 95% CI 1.44-2.62). In the reduced Cox proportional hazards model, the following factors were independently associated with an increased risk of death from all causes: older age (70-80 yrs: HR 9.96, 95% CI 3.30-30.03; 80-91 yrs: HR 9.39, 95% CI 2.68-32.89), Māori or Pacific ethnicity (HR 2.48, 95% CI 1.17-5.29), loop diuretic use (HR 3.99, 95% CI 2.15-7.40), serum creatinine (per 10 μmol/l change: HR 1.04, 95% CI 1.00-1.07), and the presence of subcutaneous tophi (HR 2.85, 95% CI 1.49-5.44). The presence of subcutaneous tophi was the only baseline variable independently associated with both cardiovascular (CV) death (HR 3.13, 95% CI 1.38-7.10) and non-CV death (HR 3.48, 95% CI 1.25-9.63).
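The SMR reported above is simply observed deaths divided by the deaths expected from population rates. As a sketch of the arithmetic, the following reproduces the point estimate and an approximate interval using Byar's approximation; the expected-death count (~21.9) is back-calculated from the paper's SMR and is my assumption, not a reported figure, and the paper's own CI method may differ slightly.

```python
import math

def smr_with_ci(observed: int, expected: float, z: float = 1.96):
    """Standardized mortality ratio with Byar's approximate 95% CI."""
    smr = observed / expected
    lo = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3 / expected
    o1 = observed + 1
    hi = o1 * (1 - 1 / (9 * o1) + z / (3 * math.sqrt(o1))) ** 3 / expected
    return smr, lo, hi

# 43 deaths observed; ~21.9 expected (back-calculated from SMR 1.96 -- an assumption)
smr, lo, hi = smr_with_ci(43, 43 / 1.96)  # ~(1.96, 1.42, 2.64), close to the reported 1.96 (1.44-2.62)
```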


People with gout disease duration < 10 years have an increased risk of death. The presence of subcutaneous tophi at baseline is an independent predictor of mortality, from both CV and non-CV causes.


The Obesity Paradox in Cancer: a Review.

Lennon H, Sperrin M, Badrick E, Renehan AG.

Curr Oncol Rep. 2016 Sep;18(9):56. doi: 10.1007/s11912-016-0539-4. Review.

PMID: 27475805 Free PMC Article




There is a common perception that excess adiposity, commonly approximated by body mass index (BMI), is associated with reduced cancer survival. A number of studies have emerged challenging this by demonstrating that overweight and early obese states are associated with improved survival. This finding is termed the "obesity paradox" and is well recognized in the cardio-metabolic literature but less so in oncology. Here, we summarize the epidemiological findings related to the obesity paradox in cancer. Our review highlights that many observations of the obesity paradox in cancer reflect methodological mechanisms, including the crudeness of BMI as an obesity measure, confounding, detection bias, reverse causality, and a specific form of selection bias known as collider bias. It is imperative for the oncologist to interpret the observation of the obesity paradox against the above methodological framework and avoid the misinterpretation that being obese might be "good" or "protective" for cancer patients.


Adiposity; BMI; Body mass index; Cancer; Cancer survival; Epidemiology; Excess weight; Mortality; Obesity; Overweight; Prognosis


Association between Body Mass Index and Cancer Survival in a Pooled Analysis of 22 Clinical Trials.

Greenlee H, Unger JM, LeBlanc M, Ramsey S, Hershman DL.

Cancer Epidemiol Biomarkers Prev. 2016 Dec 16. [Epub ahead of print]

PMID: 27986655




Data are inconsistent on the association between body mass index (BMI) at time of cancer diagnosis and prognosis. We used data from 22 clinical treatment trials to examine the association between BMI and survival across multiple cancer types and stages.


Trials with ≥5 years of follow-up were selected. Patients with BMI < 18.5 kg/m2 were excluded. Within a disease, analyses were limited to patients on similar treatment regimens. Variable cutpoint analysis identified a BMI cutpoint that maximized differences in survival. Multivariable Cox regression analyses compared survival between patients with BMI above versus below the cutpoint, adjusting for age, race, sex, and important disease-specific clinical prognostic factors.


A total of 11,724 patients from 22 trials were identified. Fourteen analyses were performed by disease site and treatment regimen. A cutpoint of BMI = 25 kg/m2 maximized survival differences. No statistically significant trend across all 14 analyses was observed (mean HR = 0.96; P = 0.06). In no cancer/treatment combination was elevated BMI associated with an increased risk of death; for some cancers there was a survival advantage for higher BMI. In sex-stratified analyses, BMI ≥ 25 kg/m2 was associated with better overall survival among men (HR = 0.82; P = 0.003), but not women (HR = 1.04; P = 0.86). The association persisted when sex-specific cancers were excluded, when treatment regimens were restricted to dose based on body surface area, and when early-stage cancers were excluded.


The association between BMI and survival is not consistent across cancer types and stages.


Our findings suggest that disease, stage, and gender-specific body size recommendations for cancer survivors may be warranted.


Association of Weight Change after Colorectal Cancer Diagnosis and Outcomes in the Kaiser Permanente Northern California Population.

Meyerhardt JA, Kroenke CH, Prado CM, Kwan ML, Castillo A, Weltzien E, Cespedes EM, Xiao J, Caan BJ.

Cancer Epidemiol Biomarkers Prev. 2016 Dec 16. [Epub ahead of print]

PMID: 27986654




Higher body mass index (BMI) is associated with incident colorectal cancer but not consistently with colorectal cancer survival. Whether weight gain or loss is associated with colorectal cancer survival is largely unknown.


We identified 2,781 patients from Kaiser Permanente Northern California diagnosed with stages I-III colorectal cancer between 2006 and 2011 with weight and height measurements within 3 months of diagnosis and approximately 18 months after diagnosis. We evaluated associations between weight change and colorectal cancer-specific and overall mortality, adjusted for sociodemographics, disease severity, and treatment.


After completion of treatment and recovery from stage I-III colorectal cancer, loss of at least 10% of baseline weight was associated with significantly worse colorectal cancer-specific mortality (HR 3.20; 95% confidence interval [CI], 2.33-4.39; P trend < 0.0001) and overall mortality (HR 3.27; 95% CI, 2.56-4.18; P trend < 0.0001). For every 5% loss of baseline weight, there was a 41% increased risk of colorectal cancer-specific mortality (95% CI, 29%-56%). Weight gain was not significantly associated with colorectal cancer-specific mortality (P trend = 0.54) or overall mortality (P trend = 0.27). The associations were largely unchanged after restricting analyses to exclude patients who died within 6 months and 12 months of the second weight measurement. No significant interactions were demonstrated for weight loss or gain by gender, stage, primary tumor location, or baseline BMI.


Weight loss after diagnosis was associated with worse colorectal cancer-specific mortality and overall mortality. Reverse causation does not appear to explain our findings.


Understanding the mechanistic underpinnings of the association of weight loss with worse mortality is important to improving patient outcomes.


Lower Pectoralis Muscle Area Is Associated with a Worse Overall Survival in Non-Small Cell Lung Cancer.

Kinsey CM, San José Estépar R, van der Velden J, Cole BF, Christiani DC, Washko GR.

Cancer Epidemiol Biomarkers Prev. 2016 May 19. [Epub ahead of print]

PMID: 27197281




Muscle wasting is a component of the diagnosis of cancer cachexia and has been associated with poor prognosis. However, recommended tools to measure sarcopenia are limited by poor sensitivity or the need to perform additional scans. We hypothesized that pectoralis muscle area (PMA) measured objectively on chest CT scan may be associated with overall survival (OS) in non-small cell lung cancer (NSCLC).


We evaluated 252 cases from a prospectively enrolling lung cancer cohort. Eligible cases had CT scans performed prior to the initiation of surgery, radiation, or chemotherapy. PMA was measured in a semi-automated fashion while blinded to characteristics of the tumor, lung, and patient outcomes.


Men had a significantly greater PMA than women (37.59 vs. 26.19 cm2, P < 0.0001). In univariate analysis, PMA was associated with age and body mass index (BMI). A Cox proportional hazards model was constructed to account for confounders associated with survival. Each 1 cm2 lower PMA at diagnosis was associated with a 2% increase in the hazard of death (HRadj per cm2 increase, 0.98; 95% confidence interval, 0.96-0.99; P = 0.044), adjusting for age, sex, smoking, chronic bronchitis, emphysema, histology, stage, chemotherapy, radiation, surgery, BMI, and ECOG performance status.


Lower PMA measured from chest CT scans obtained at the time of diagnosis of NSCLC is associated with a worse OS.


PMA may be a valuable CT biomarker for sarcopenia-associated lung cancer survival.


Postdiagnosis Weight Change and Survival Following a Diagnosis of Early-Stage Breast Cancer.

Cespedes Feliciano EM, Kroenke CH, Bradshaw PT, Chen WY, Prado CM, Weltzien EK, Castillo AL, Caan BJ.

Cancer Epidemiol Biomarkers Prev. 2016 Aug 26. [Epub ahead of print]

PMID: 27566419




Achieving a healthy weight is recommended for all breast cancer survivors. Previous research on postdiagnosis weight change and mortality had conflicting results.


We examined whether change in body weight in the 18 months following diagnosis is associated with overall and breast cancer-specific mortality in a cohort of n = 12,590 stage I-III breast cancer patients at Kaiser Permanente using multivariable-adjusted Cox regression models. Follow-up was from the date of the postdiagnosis weight at 18 months until death or June 2015 [median follow-up (range): 3 (0-9) years]. We divided follow-up into earlier (18-54 months) and later (>54 months) postdiagnosis periods.


Mean (SD) age-at-diagnosis was 59 (11) years. A total of 980 women died, 503 from breast cancer. Most women maintained weight within 5% of diagnosis body weight; weight loss and gain were equally common at 19% each. Compared with weight maintenance, large losses (≥10%) were associated with worse survival, with HRs and 95% confidence intervals (CI) for all-cause death of 2.63 (2.12-3.26) earlier and 1.60 (1.14-2.25) later in follow-up. Modest losses (>5%-<10%) were associated with worse survival earlier [1.39 (1.11-1.74)] but not later in follow-up [0.77 (0.54-1.11)]. Weight gain was not related to survival. Results were similar for breast cancer-specific death.


Large postdiagnosis weight loss is associated with worse survival in both earlier and later postdiagnosis periods, independent of treatment and prognostic factors.


Weight loss and gain are equally common after breast cancer, and weight loss is a consistent marker of mortality risk.


A Stress-Resistant Lipidomic Signature Confers Extreme Longevity to Humans.

Jové M, Naudí A, Gambini J, Borras C, Cabré R, Portero-Otín M, Viña J, Pamplona R.

J Gerontol A Biol Sci Med Sci. 2017 Jan;72(1):30-37.

PMID: 27013396



Plasma lipidomic profile is species specific and an optimized feature associated with animal longevity. In the present work, mass spectrometry technologies allowed us to determine the plasma lipidomic profile and the fatty acid pattern of healthy humans with exceptional longevity. Here, we show that a lipidomic signature of only 20 lipid species can discriminate adult, aged and centenarian subjects with almost perfect accuracy (90%-100%). Furthermore, we propose specific lipid species belonging to the ceramides, which are widely involved in the cell-stress response, as biomarkers of extreme human longevity. In addition, we show that extreme longevity presents a fatty acid profile resistant to lipid peroxidation. Our findings indicate that the lipidomic signature is an optimized feature associated with extreme human longevity, and that specific lipid molecular species and lipid unsaturation are potential biomarkers of longevity.


Centenarians; Fatty acid unsaturation; Lipid molecular species; Mass spectrometry; Peroxidizability index


Simple Test of Manual Dexterity Can Help to Identify Persons at High Risk for Neurodegenerative Diseases in the Community.

Darweesh SK, Wolters FJ, Hofman A, Stricker BH, Koudstaal PJ, Ikram MA.

J Gerontol A Biol Sci Med Sci. 2017 Jan;72(1):75-81.

PMID: 27371953




Early identification of individuals at high risk of developing neurodegenerative diseases is essential for timely preventive intervention. However, simple methods that can be used for risk assessment in general practice are lacking.


Within the population-based Rotterdam Study, we used the Purdue Pegboard Test (PPT) to assess manual dexterity in 4,856 persons (median age 70 years, 58% women) free of parkinsonism and dementia between 2000 and 2004. We followed these persons until January 1, 2012 for the onset of neurodegenerative diseases (defined as first diagnosis of parkinsonism or dementia). We determined the association of PPT scores with incident neurodegenerative disease, adjusting for age, sex, study cohort, level of education, smoking, preferred hand, parental history, memory complaints, and Mini-Mental State Examination. Furthermore, we determined the incremental predictive value of PPT, expressed as change in risk classification and discrimination.


During follow-up (median 9.2 years), 277 participants were diagnosed with a neurodegenerative disease (227 with dementia and 50 with parkinsonism). Lower PPT scores were associated with a higher risk of incident neurodegenerative disease (hazard ratio [HR] = 1.28, 95% confidence interval [CI]: 1.18-1.41) and improved discrimination of incident neurodegenerative disease. We also observed significant associations of PPT scores separately with incident dementia (HR = 1.25; 95% CI: 1.14-1.39) and incident parkinsonism (HR = 1.41; 95% CI: 1.19-1.67).


A rapid, nonlaboratory test of manual dexterity may help to identify persons at high risk for neurodegenerative diseases. This highlights the importance of motor function in the preclinical phase of both dementia and parkinsonism and may aid in selecting individuals for refined screening and neuroprotective trials.


Dementia; Parkinsonism; Preclinical study; Prediction


Effects of Age and Functional Status on the Relationship of Systolic Blood Pressure With Mortality in Mid and Late Life: The ARIC Study.

Windham BG, Griswold ME, Lirette S, Kucharska-Newton A, Foraker RE, Rosamond W, Coresh J, Kritchevsky S, Mosley TH Jr.

J Gerontol A Biol Sci Med Sci. 2017 Jan;72(1):89-94.

PMID: 26409066




Impaired functional status attenuates the relationship of systolic blood pressure (SBP) with mortality in older adults but has not been studied in middle-aged populations.


Among 10,264 stroke-free Atherosclerosis Risk in Communities participants (mean age 62.8 [5.7] years; 6,349 [62%] younger [<65 years]; 5,148 [50%] men; 2,664 [26%] Black), function was defined as good function (GF) for those self-reporting no difficulty performing functional tasks and basic or instrumental tasks of daily living; all others were defined as impaired function (IF). SBP categories were normal (<120 mmHg), prehypertension (120-139 mmHg), and hypertension (≥140 mmHg). Mortality risk associated with SBP was estimated using adjusted Cox proportional hazard models with a triple interaction between age, functional status, and SBP.


Mean follow-up was 12.9 years with 2,863 (28%) deaths. Among younger participants, 3,017 (48%) had IF; 2,279 of 3,915 (58%) older participants had IF. Prehypertension (hazard ratio [HR] = 1.48 [1.03, 2.15], p = .04) and hypertension (HR = 1.97 [1.29, 3.03], p = .002) were associated with mortality in younger GF participants, as they were in older (≥65 years) GF participants (prehypertension HR = 1.21 [1.06, 1.37], p = .005; hypertension HR = 1.47 [1.36, 1.59], p < .001). Among IF participants, prehypertension was not associated with mortality in younger participants (HR = 0.99 [0.85, 1.15], p = .93) and was protective in older participants (HR = 0.87 [0.85, 0.90], p < .001). Hypertension was associated with mortality in younger IF participants (HR = 1.54 [1.30, 1.82], p < .001) but not in older IF participants (HR = 0.99 [0.87, 1.14], p = .93).


Compared with younger and well-functioning persons, the additional contribution of blood pressure to mortality is much lower with older age and impaired function, particularly if both are present. Functional status and age could potentially inform optimal blood pressure targets.


Blood pressure; Functional status; Middle aged; Mortality


The Survival of Spouses Marrying Into Longevity-Enriched Families.

Pedersen JK, Elo IT, Schupf N, Perls TT, Stallard E, Yashin AI, Christensen K.

J Gerontol A Biol Sci Med Sci. 2017 Jan;72(1):109-114.

PMID: 27540092




Studies of longevity-enriched families are an important tool to gain insight into the mechanisms of exceptionally long and healthy lives. In the Long Life Family Study, the spouses of the members of the longevity-enriched families are often used as a control group. These spouses could be expected to have better health than the background population due to shared family environment with the longevity-enriched family members and due to assortative mating.


A Danish cohort study of 5,363 offspring of long-lived siblings, born 1917-1982, and 4,498 "first spouses" of these offspring. For each offspring and spouse, 10 controls were drawn from a 5% random sample of the Danish population matched on birth year and sex. Mortality was assessed for ages 20-69 years during 1968-2013 based on prospectively collected registry data.


During the 45-year follow-up period, 437 offspring deaths and 502 offspring spouse deaths were observed. Compared with the background population, the hazard ratio for male offspring was 0.44 (95% confidence interval [CI]: 0.38-0.50) and for female offspring it was 0.57 (95% CI: 0.49-0.66). For male spouses, the hazard ratio was 0.66 (95% CI: 0.59-0.74), whereas for female spouses it was 0.64 (95% CI: 0.54-0.76). Sensitivity analyses in restricted samples gave similar results.


The mortality at ages 20-69 years of spouses marrying into longevity-enriched families is substantially lower than that of the background population, although participation bias among the long-lived siblings may have contributed to the difference. This finding has implications for the use of spouses as controls in healthy aging and longevity studies, as environmental and/or genetic overmatching may occur.


Long-lived families; Mortality; Offspring; Spousal overmatching


Association Between Sleep Characteristics and Incident Dementia Accounting for Baseline Cognitive Status: A Prospective Population-Based Study.

Bokenberger K, Ström P, Dahl Aslan AK, Johansson AL, Gatz M, Pedersen NL, Åkerstedt T.

J Gerontol A Biol Sci Med Sci. 2017 Jan;72(1):134-139.

PMID: 27402049




Although research has shown that sleep disorders are prevalent among people with dementia, the temporal relationship is unclear. We investigated whether atypical sleep characteristics were associated with incident dementia while accounting for baseline cognitive functioning.


Screening Across the Lifespan Twin (SALT) study participants were 11,247 individuals from the Swedish Twin Registry who were at least 65 years at baseline (1998-2002). Sleep and baseline cognitive functioning were assessed via the SALT telephone screening interview. Data on dementia diagnoses came from national health registers. Cox regression was performed to estimate hazard ratios for dementia.


After 17 years of follow-up, 1,850 dementia cases were identified. Short (≤6 hours) and extended (>9 hours) time in bed (TIB) compared to the middle reference group (hazard ratio = 1.40, 95% confidence interval = 1.06-1.85; hazard ratio = 1.11, 95% confidence interval = 1.00-1.24, respectively) and rising at 8:00 AM or later compared to earlier rising (hazard ratio = 1.12, 95% confidence interval = 1.01-1.24) were associated with higher dementia incidence. Bedtime, sleep quality, restorative sleep, and heavy snoring were not significant predictors. Findings stratified by baseline cognitive status indicated that the association between short TIB and dementia remained in those cognitively intact at the start.


Short and extended TIB and delayed rising among older adults predicted increased dementia incidence in the following 17 years. The pattern of findings suggests that extended TIB and late rising represent prodromal features whereas short TIB appeared to be a risk factor for dementia.


Cognitive impairment; Dementia; Prodromal sign; Risk factor; Sleep characteristics


Should We Screen for Vitamin D Deficiency?: Grand Rounds Discussion From Beth Israel Deaconess Medical Center.

Libman H, Malabanan AO, Strewler GJ, Reynolds EE.

Ann Intern Med. 2016 Dec 6;165(11):800-807. doi: 10.7326/M16-1993.

PMID: 27919096



The U.S. Preventive Services Task Force (USPSTF) recently issued guidelines on screening for vitamin D deficiency. The guidelines were based on randomized trials of vitamin D deficiency screening and treatment, as well as on case-control studies nested within the Women's Health Initiative. The USPSTF concluded that current evidence is insufficient to assess the benefits and harms of screening for vitamin D deficiency in asymptomatic adults. Compared with placebo or no treatment, vitamin D was associated with decreased mortality; however, benefits were no longer seen after trials of institutionalized persons were excluded. Vitamin D treatment was associated with a possible decreased risk for at least 1 fall and the total number of falls per person but not for fractures. None of the studies examined the effects of vitamin D screening versus not screening on clinical outcomes. In this Grand Rounds, 2 prominent endocrinologists debate the issue of screening for vitamin D deficiency in a 55-year-old, asymptomatic, postmenopausal woman. They review the data on which the USPSTF recommendations are based and discuss the potential benefits and risks, as well as the challenges and controversies, of screening for vitamin D deficiency in primary care practice.


The Safety Limits Of An Extended Fast: Lessons from a Non-Model Organism.

Bertile F, Fouillen L, Wasselin T, Maes P, Le Maho Y, Van Dorsselaer A, Raclot T.

Sci Rep. 2016 Dec 19;6:39008. doi: 10.1038/srep39008.

PMID: 27991520



While safety of fasting therapy is debated in humans, extended fasting occurs routinely and safely in wild animals. To do so, food deprived animals like breeding penguins anticipate the critical limit of fasting by resuming feeding. To date, however, no molecular indices of the physiological state that links spontaneous refeeding behaviour with fasting limits had been identified. Blood proteomics and physiological data reveal here that fasting-induced body protein depletion is not unsafe "per se". Indeed, incubating penguins only abandon their chick/egg to refeed when this state is associated with metabolic defects in glucose homeostasis/fatty acid utilization, insulin production and action, and possible renal dysfunctions. Our data illustrate how the field investigation of "exotic" models can be a unique source of information, with possible biomedical interest.


Guidelines to Limit Added Sugar Intake: Junk Science or Junk Food?

Schillinger D, Kearns C.

Ann Intern Med. 2016 Dec 20. doi: 10.7326/M16-2754. [Epub ahead of print] No abstract available.

PMID: 27992900


When it comes to added sugars, there are clear

conflicts between public health interests and the

interests of the food and beverage (F&B) industry. Studies

are more likely to conclude there is no relationship

between sugar consumption and health outcomes

when investigators receive financial support from F&B

companies (1). Industry documents show that the F&B

industry has manipulated research on sugars for public

relations purposes (2). Erickson and colleagues report a

systematic review of the scientific basis of guidelines on

sugar intake, providing another occasion for concern

about these conflicts (3). The review examined 9 guidelines

that offered 12 recommendations on sugar consumption.

It concluded that the guidelines do not meet

criteria for trustworthy recommendations, judging the

evidence supporting each recommendation to be of

low quality.

It is important to note that the North American

branch of the International Life Sciences Institute

(ILSI North America) funded the review. ILSI North

America is a trade group representing The Coca-Cola

Company; Dr Pepper Snapple Group; The Hershey

Company; Mars, Inc.; Nestle´ USA; and PepsiCo, among

others (4). In essence, this study suggests that placing

limits on “junk food” is based on “junk science,” a conclusion

favorable to the F&B industry. The implication is

that the organizations producing the guidelines may

have overstated the strength of the evidence on added

sugars due to their belief that overconsumption has

negative health effects. Although this study involves the

F&B industry, similar claims were made by the tobacco

industry in its attempt to discredit evidence on the

harms of tobacco, including passive smoking (5).

Because of potential conflicts of interest, we believe

it is important to recognize shortcomings of the

review. First, the authors used the inconsistency of recommendations

across guidelines as a rationale to raise

concern about the quality of the guidelines and to signal

the need for a review. However, the guidelines examined

were issued between 1995 and 2016; one

would expect recommendations spanning more than 2

decades to evolve as scientific knowledge evolved. The

most recent guidelines from Public Health England, the

World Health Organization (WHO), and the U.S. Department

of Agriculture show remarkable consistency,

recommending limits ranging from less than 5% to less

than 10% of daily calories from sugar intake. The outlier

was the 2002 Institute of Medicine guideline, which was

partly funded by ILSI North America. This further calls

into question the claim that variation in recommendations

is a reason to examine the trustworthiness of

sugar guidelines.

Second, the review considers the funding source to

be a characteristic determining the trustworthiness of a

guideline. Yet, the authors do not comment on the fact

that the aforementioned Institute of Medicine guideline,

partly funded by ILSI, set a maximal intake level for

added sugars at 25%, well above average consumption

levels. They also described as “unclear” the funding of

the Dietary Guidelines for Americans (DGA) (which recommended

limiting sugars to <10% of calories), questioning

its editorial independence. This assessment is

curious: The review's appendix acknowledges that the

DGA is federally sponsored and that advisory committee

members were thoroughly vetted for conflicts per

federal rules.

Third, use of the Appraisal of Guidelines for Research and Evaluation, 2nd edition (AGREE II) instrument to assess guideline quality guaranteed ratings of poor quality. AGREE II is designed for clinical practice guidelines in the treatment of illness (6). The objective of dietary guidelines is to assess risks of consumption at the population level, not to evaluate interventions to reduce consumption. Using this tool, the authors downgraded the trustworthiness of guidelines because ways to limit sugar intake “were not clearly presented” and because “likely barriers to and facilitators of implementation” were not discussed. They also created de novo an overall guideline quality score of 1 to 7, with interrater differences of 3 points permitted, yet did not report reliability of this score.

Fourth, the review's unconventional application of the GRADE (Grading of Recommendations Assessment, Development and Evaluation) system to evaluate quality of evidence for guidelines is problematic. The WHO already used GRADE to rate evidence as moderate-quality. The ILSI-funded review regraded some evidence in favor of its sponsor. Evidence on body weight, which the WHO judged to have “no serious inconsistencies,” was inexplicably downgraded for inconsistency. The current review also disagreed with the WHO on dental caries evidence, determining that large effect sizes from observational studies should not be considered when judging quality.

Finally, the authors claimed that the food pattern modeling and U.S. national caloric data used to inform the DGA are not publicly available, prohibiting them from applying GRADE to assess evidence quality. This statement represents a serious oversight that undermines the review's conclusions that “using the GRADE approach, we found that the overall quality of evidence to support recommendations was low to very low.” Although they acknowledged the “extensive scientific report” commissioned to support the DGA, they ignored that the methods used to assess dietary patterns are described in detail in its readily accessible Appendix E-3.7, together with a 500-page supporting report, “A Series of Systematic Reviews on the Relationship Between Dietary Patterns and Health Outcomes” (7), conducted by the U.S. Department of Agriculture's Nutrition Evidence Library. The supporting report describes the Nutrition Evidence Library's compliance with the Consolidated Appropriations Act of 2001, which mandates federal agencies to ensure the quality, objectivity, utility, and integrity of the information used to form federal guidance.

ILSI North America presents itself as “a public, nonprofit scientific foundation that advances the understanding and application of science related to the nutritional quality and safety of the food supply” (4). Although scrutiny of dietary guidelines is warranted, we believe that this review is an example of the “politicization of science” (8). Politicization occurs when an actor overly accentuates inherent uncertainties of science to cast doubt on the scientific consensus. In pursuit of an agenda, the actor creates messages to stunt the impact of persuasive information from a credible source. Might ILSI North America's agenda in funding this review have been to undermine recent guidelines recommending limits on sugar consumption? We believe that it is prudent to consider this possibility, given ILSI's history of opposition to quantitative sugar guidelines. In 1989, the WHO recommended reducing sugar intake to less than 10% of calories; in 2002, a WHO/FAO (Food and Agriculture Organization of the United Nations) Expert Consultation recommended the same. In 1998, ILSI and the World Sugar Research Organisation clandestinely funded a report from the WHO and the United Nations concluding that there was no evidence to set an upper limit for sugar (9), while the sugar industry blocked the 2002 recommendation from becoming WHO policy (10).

What can journals, the media, policymakers, and other stakeholders do to counteract tactics that industry often uses to advocate for the safety of unsafe products or question the integrity of science that calls their products into question? To combat the tobacco industry's influence over scientific discourse, leading journal editors recently refused to be passive conduits for articles funded by the tobacco industry. Accordingly, high-quality journals could refrain from publishing studies on health effects of added sugars funded by entities with commercial interests in the outcome.

In summary, our concerns about the funding source and methods of the current review preclude us from accepting its conclusion that recommendations to limit added sugar consumption to less than 10% of calories are not trustworthy. Policymakers, when confronted with claims that sugar guidelines are based on “junk science,” should consider whether “junk food” was the source.


1. Do Sugar-Sweetened Beverages Cause Obesity and Diabetes? Industry and the Manufacture of Scientific Controversy.

Schillinger D, Tran J, Mangurian C, Kearns C.

Ann Intern Med. 2016 Dec 20;165(12):895-897. doi: 10.7326/L16-0534.

PMID: 27802504



2. Sugar Industry and Coronary Heart Disease Research: A Historical Analysis of Internal Industry Documents.

Kearns CE, Schmidt LA, Glantz SA.

JAMA Intern Med. 2016 Nov 1;176(11):1680-1685. doi: 10.1001/jamainternmed.2016.5394.

PMID: 27617709



Early warning signals of the coronary heart disease (CHD) risk of sugar (sucrose) emerged in the 1950s. We examined Sugar Research Foundation (SRF) internal documents, historical reports, and statements relevant to early debates about the dietary causes of CHD and assembled findings chronologically into a narrative case study. The SRF sponsored its first CHD research project in 1965, a literature review published in the New England Journal of Medicine, which singled out fat and cholesterol as the dietary causes of CHD and downplayed evidence that sucrose consumption was also a risk factor. The SRF set the review's objective, contributed articles for inclusion, and received drafts. The SRF's funding and role was not disclosed. Together with other recent analyses of sugar industry documents, our findings suggest the industry sponsored a research program in the 1960s and 1970s that successfully cast doubt about the hazards of sucrose while promoting fat as the dietary culprit in CHD. Policymaking committees should consider giving less weight to food industry-funded studies and include mechanistic and animal studies as well as studies appraising the effect of added sugars on multiple CHD biomarkers and disease development.


3. The Scientific Basis of Guideline Recommendations on Sugar Intake: A Systematic Review.

Erickson J, Sadeghirad B, Lytvyn L, Slavin J, Johnston BC.

Ann Intern Med. 2016 Dec 20. doi: 10.7326/M16-2020. [Epub ahead of print]

PMID: 27992898




The relationship between sugar and health is affected by energy balance, macronutrient substitutions, and diet and lifestyle patterns. Several authoritative organizations have issued public health guidelines addressing dietary sugars.


To systematically review guidelines on sugar intake and assess consistency of recommendations, methodological quality of guidelines, and the quality of evidence supporting each recommendation.


MEDLINE, EMBASE, and Web of Science (1995 to September 2016); guideline registries; and gray literature (bibliographies, Google, and experts).


Guidelines addressing sugar intake that reported their methods of development and were published in English between 1995 and 2016.


Three reviewers independently assessed guideline quality using the Appraisal of Guidelines for Research and Evaluation, 2nd edition (AGREE II), instrument. To assess evidence quality, articles supporting recommendations were independently reviewed and their quality was determined by using GRADE (Grading of Recommendations Assessment, Development and Evaluation) methods.


The search identified 9 guidelines that offered 12 recommendations. Each of the reviewed guidelines indicated a suggested decrease in the consumption of foods containing nonintrinsic sugars. The guidelines scored poorly on AGREE II criteria, specifically in rigor of development, applicability, and editorial independence. Seven recommendations provided nonquantitative guidance; 5 recommended less than 25% to less than 5% of total calories from nonintrinsic sugars. The recommendations were based on various health concerns, including nutrient displacement, dental caries, and weight gain. Quality of evidence supporting recommendations was low to very low.


The authors conducted the study independent of the funding source, which is primarily supported by the food and agriculture industry.


Guidelines on dietary sugar do not meet criteria for trustworthy recommendations and are based on low-quality evidence. Public health officials (when promulgating these recommendations) and their public audience (when considering dietary behavior) should be aware of these limitations.


Exercise as Medicine.

Katz PP, Pate R.

Ann Intern Med. 2016 Dec 20;165(12):880-881. doi: 10.7326/M16-2086. No abstract available.

PMID: 27668671



Effect of Structured Physical Activity on Overall Burden and Transitions Between States of Major Mobility Disability in Older Persons: Secondary Analysis of a Randomized Trial.

Gill TM, Guralnik JM, Pahor M, Church T, Fielding RA, King AC, Marsh AP, Newman AB, Pellegrini CA, Chen SH, Allore HG, Miller ME; LIFE Study Investigators.

Ann Intern Med. 2016 Dec 20;165(12):833-840. doi: 10.7326/M16-0529.

PMID: 27669457




The total time a patient is disabled likely has a greater influence on his or her quality of life than the initial occurrence of disability alone.


To compare the effect of a long-term, structured physical activity program with that of a health education intervention on the proportion of patient assessments indicating major mobility disability (MMD) (that is, MMD burden) and on the risk for transitions into and out of MMD.


Single-blinded, parallel-group, randomized trial.


8 U.S. centers between February 2010 and December 2013.


1635 sedentary persons, aged 70 to 89 years, who had functional limitations but could walk 400 m.


Physical activity (n = 818) and health education (n = 817).


MMD, defined as the inability to walk 400 m, was assessed every 6 months for up to 3.5 years.


During a median follow-up of 2.7 years, the proportion of assessments showing MMD was substantially lower in the physical activity (0.13 [95% CI, 0.11 to 0.15]) than the health education (0.17 [CI, 0.15 to 0.19]) group, yielding a risk ratio of 0.75 (CI, 0.64 to 0.89). In a multistate model, the hazard ratios for comparisons of physical activity with health education were 0.87 (CI, 0.73 to 1.03) for the transition from no MMD to MMD; 0.52 (CI, 0.10 to 2.67) for no MMD to death; 1.33 (CI, 0.99 to 1.77) for MMD to no MMD; and 1.92 (CI, 1.15 to 3.20) for MMD to death.
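As a quick sanity check on the figures above, the crude ratio of the reported MMD proportions (0.13 vs 0.17) can be computed directly. Note that the trial's 0.75 is a covariate-adjusted estimate, so the crude value differs slightly; this is an illustration, not the study's analysis:

```python
# Crude (unadjusted) risk ratio from the reported MMD assessment proportions.
# The trial's published 0.75 is adjusted, so the crude ratio differs slightly.
def risk_ratio(p_intervention: float, p_control: float) -> float:
    """Ratio of event proportions: intervention over control."""
    return p_intervention / p_control

rr = risk_ratio(0.13, 0.17)
print(f"crude risk ratio = {rr:.2f}")  # prints 0.76, close to the adjusted 0.75
```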


The intention-to-treat principle was maintained for MMD burden and first transition out of no MMD, but not for subsequent transitions.


A structured physical activity program reduced the MMD burden for an extended period, in part through enhanced recovery after the onset of disability and diminished risk for subsequent disability episodes.


https://www.ncbi.nlm.nih.gov/pubmed/22469110 >> Dynapenia (pronounced dahy-nuh-pē-nē-a, Greek translation for poverty of strength, power, or force) is the age-associated loss of muscle strength that is not caused by neurologic or muscular diseases.

Dynapenia and Metabolic Health in Obese and Nonobese Adults Aged 70 Years and Older: The LIFE Study.

Aubertin-Leheudre M, Anton S, Beavers DP, Manini TM, Fielding R, Newman A, Church T, Kritchevsky SB, Conroy D, McDermott MM, Botoseneanu A, Hauser ME, Pahor M; LIFE Research Group.

J Am Med Dir Assoc. 2016 Nov 30. pii: S1525-8610(16)30468-6. doi: 10.1016/j.jamda.2016.10.001. [Epub ahead of print]

PMID: 27914851




The purpose of this study was to examine the relationship between dynapenia and metabolic risk factors in obese and nonobese older adults.


A total of 1453 men and women (age ≥70 years) from the Lifestyle Interventions and Independence for Elders (LIFE) Study were categorized as (1) nondynapenic/nonobese (NDYN-NO), (2) dynapenic/nonobese (DYN-NO), (3) nondynapenic/obese (NDYN-O), or (4) dynapenic/obese (DYN-O), based on muscle strength (Foundation for the National Institute of Health criteria) and body mass index. Dependent variables were blood lipids, fasting glucose, blood pressure, presence of at least 3 metabolic syndrome (MetS) criteria, and other chronic conditions.


A significantly higher likelihood of having abdominal obesity criteria in NDYN-NO compared with DYN-NO groups (55.6 vs 45.1%, P ≤ .01) was observed. Waist circumference also was significantly higher in obese groups (DYN-O = 114.0 ± 12.9 and NDYN-O = 111.2 ± 13.1) than in nonobese (NDYN-NO = 93.1 ± 10.7 and DYN-NO = 92.2 ± 11.2, P ≤ .01); and higher in NDYN-O compared with DYN-O (P = .008). Additionally, NDYN-O demonstrated higher diastolic blood pressure compared with DYN-O (70.9 ± 10.1 vs 67.7 ± 9.7, P ≤ .001). No significant differences were found across dynapenia and obesity status for all other metabolic components (P > .05). The odds of having MetS or its individual components were similar in obese and nonobese, combined or not with dynapenia (nonsignificant odds ratio [95% confidence interval]).


Nonobese dynapenic older adults had fewer metabolic disease risk factors than nonobese and nondynapenic older adults. Moreover, among obese older adults, dynapenia was associated with lower risk of meeting MetS criteria for waist circumference and diastolic blood pressure. Additionally, the presence of dynapenia did not increase cardiometabolic disease risk in either obese or nonobese older adults.


Muscle strength; aging; metabolic syndrome; obesity


Low Hemoglobin Levels and the Onset of Cognitive Impairment in Older People: The PRO.V.A. Study.

Trevisan C, Veronese N, Bolzetta F, De Rui M, Maggi S, Zambon S, Musacchio E, Sartori L, Perissinotto E, Crepaldi G, Manzato E, Sergi G.

Rejuvenation Res. 2016 Dec;19(6):447-455.

PMID: 26778182



Low hemoglobin (Hb) levels are attracting interest as a risk factor for cognitive impairment, but with contrasting evidence emerging from the current literature. The aim of our work was to investigate the relationship between baseline serum Hb levels and the incidence of cognitive impairment in older people over a follow-up of 4.4 years. Our study considered a sample of 1227 elderly subjects cognitively intact at baseline, enrolled under the Progetto Veneto Anziani (Pro.V.A.) among 3099 screened subjects. For all participants, we measured serum Hb levels on blood samples; incident cognitive impairment was defined as a Mini Mental State Examination (MMSE) score <24 and confirmed by geriatricians skilled in psychogeriatric medicine. No differences in baseline MMSE scores across Hb tertiles emerged in either gender. After the 4.4 years of follow-up, we identified 403 new cases of cognitive impairment (147 men and 256 women). Cox's regression analysis showed that participants with the lowest baseline Hb concentrations carried a significant 37% higher risk (95% confidence interval [CI]: 1.08-1.75; p = 0.01) of being diagnosed with cognitive impairment during the follow-up. Considering the gender separately, the risk of cognitive impairment only increased significantly, by 60%, for men in the lowest Hb tertile (95% CI: 1.06-2.41; p = 0.02), but not for women (hazard ratio = 1.32; 95% CI: 0.97-1.79; p = 0.08). In conclusion, low Hb concentrations may predict the onset of cognitive impairment in the elderly, apparently with a stronger association in men than in women.


anemia; cognitive impairment; dementia; elderly; hemoglobin concentration; sex differences


Persistent microbiome alterations modulate the rate of post-dieting weight regain

Christoph A. Thaiss, Shlomik Itav, Daphna Rothschild, Mariska T. Meijer, Maayan Levy, Claudia Moresi, Lenka Dohnalová, Sofia Braverman, Shachar Rozin, Sergey Malitsky, Mally Dori-Bachash, Yael Kuperman, Inbal Biton, Arieh Gertler, Alon Harmelin, Hagit Shapiro, Zamir Halpern, Asaph Aharoni, Eran Segal & Eran Elinav

Nature 540, 544–551 (22 December 2016) doi:10.1038/nature20796

Received 22 February 2016 Accepted 18 November 2016 Published online 24 November 2016


Journal pre-blurb: The identification of an intestinal microbiome signature that persists after successful dieting in obese mice and contributes to faster weight regain upon re-exposure to an obesity-promoting diet, and that transmits the altered weight regain phenotype to non-dieting mice.


In tackling the obesity pandemic, considerable efforts are devoted to the development of effective weight reduction strategies, yet many dieting individuals fail to maintain a long-term weight reduction, and instead undergo excessive weight regain cycles. The mechanisms driving recurrent post-dieting obesity remain largely elusive. Here we identify an intestinal microbiome signature that persists after successful dieting of obese mice and contributes to faster weight regain and metabolic aberrations upon re-exposure to obesity-promoting conditions. Faecal transfer experiments show that the accelerated weight regain phenotype can be transmitted to germ-free mice. We develop a machine-learning algorithm that enables personalized microbiome-based prediction of the extent of post-dieting weight regain. Additionally, we find that the microbiome contributes to diminished post-dieting flavonoid levels and reduced energy expenditure, and demonstrate that flavonoid-based ‘post-biotic’ intervention ameliorates excessive secondary weight gain. Together, our data highlight a possible microbiome contribution to accelerated post-dieting weight regain, and suggest that microbiome-targeting approaches may help to diagnose and treat this common disorder.

Subject terms: Microbiology Medical research Microbial communities Metabolism


Patients treated by female doctors more likely to leave hospital alive: Harvard study

Research finds link between doctors' gender and older patients' survival

CBC News Posted: Dec 20, 2016




Women in Medicine and Patient Outcomes

Equal Rights for Better Work?

Anna L. Parks, MD1,2; Rita F. Redberg, MD, MSc1,3

JAMA Intern Med. Published online December 19, 2016. doi:10.1001/jamainternmed.2016.7883


If you are aware of the disparities between genders in academic medicine, recent publications show multiple areas with opportunities for improvement. Jena and colleagues1 found that female physicians in academia were less likely than their male counterparts to have reached the rank of full professor (11.9% vs 28.6%). Sege et al2 reported that start-up funding packages—which help launch faculty careers—were 67.5% higher for men than for women ($980 000 vs $585 000). Finally, Jena and colleagues3 reported that salaries for female academic physicians are $19 879, or 8.0%, lower than those of their male colleagues.

Among the myriad rationalizations for these disparities between the genders in academic medicine, some have suggested that the burden of home responsibilities, leave for childbearing, or part-time schedules might undermine the quality of female physicians’ work and explain male physicians’ higher salaries. In this issue of JAMA Internal Medicine, Tsugawa et al4 find that the evidence shows the opposite for all internists, not just those in academia, as detailed above.

The group examined data from hospitalized Medicare patients and found that patients treated by female internists fared better than patients treated by male internists, with lower 30-day readmissions (15.02% vs 15.57%) and lower 30-day mortality (11.07% vs 11.49%).4 The differences persisted across 8 medical conditions ranging from arrhythmia to sepsis. Improvements in mortality were strongest for the most severely ill patients. A sensitivity analysis restricted to hospitalists, to whom patients were presumably randomly assigned, found no difference in the severity of illness for patients according to physician gender and confirmed better outcomes for patients treated by female physicians.

We support investigation of practice patterns that mediate improved clinical outcomes. Tsugawa et al4 suggest that these improved outcomes may be the result of female physicians’ greater reliance on clinical guidelines, but such adherence does not always equate with quality or value of care,5 so additional attributes should be examined. Previous work has shown that female physicians have a more patient-centered communication style, are more encouraging and reassuring, and have longer visits than male physicians.6,7 In a system that is increasingly focused on pay for performance, behaviors that lead to improved outcomes are rewarded, which might narrow the pay gap between the genders. Moreover, these findings that female internists provide higher quality care for hospitalized patients yet are promoted, supported, and paid less than male peers in the academic setting should push us to create systems that promote equity in start-up packages, career advancement, and remuneration for all physicians. Such equity promises to result in better professional fulfillment for all physicians as well as improved patient satisfaction and outcomes.


Comparison of Hospital Mortality and Readmission Rates for Medicare Patients Treated by Male vs Female Physicians

Yusuke Tsugawa, MD, MPH, PhD1,2; Anupam B. Jena, MD, PhD3,4,5; Jose F. Figueroa, MD, MPH1,2; et al E. John Orav, PhD2,6; Daniel M. Blumenthal, MD, MBA7; Ashish K. Jha, MD, MPH1,2,8

JAMA Intern Med. Published online December 19, 2016. doi:10.1001/jamainternmed.2016.7875


Key Points

Question Do patient outcomes differ between those treated by male and female physicians?

Findings In this cross-sectional study, we examined nationally representative data of hospitalized Medicare beneficiaries and found that patients treated by female physicians had significantly lower mortality rates (adjusted mortality rate, 11.07% vs 11.49%) and readmission rates (adjusted readmission rate, 15.02% vs 15.57%) compared with those cared for by male physicians within the same hospital.

Meaning Differences in practice patterns between male and female physicians, as suggested in previous studies, may have important clinical implications for patient outcomes.


Importance Studies have found differences in practice patterns between male and female physicians, with female physicians more likely to adhere to clinical guidelines and evidence-based practice. However, whether patient outcomes differ between male and female physicians is largely unknown.

Objective To determine whether mortality and readmission rates differ between patients treated by male or female physicians.

Design, Setting, and Participants We analyzed a 20% random sample of Medicare fee-for-service beneficiaries 65 years or older hospitalized with a medical condition and treated by general internists from January 1, 2011, to December 31, 2014. We examined the association between physician sex and 30-day mortality and readmission rates, adjusted for patient and physician characteristics and hospital fixed effects (effectively comparing female and male physicians within the same hospital). As a sensitivity analysis, we examined only physicians focusing on hospital care (hospitalists), among whom patients are plausibly quasi-randomized to physicians based on the physician’s specific work schedules. We also investigated whether differences in patient outcomes varied by specific condition or by underlying severity of illness.

Main Outcomes and Measures Patients’ 30-day mortality and readmission rates.

Results A total of 1 583 028 hospitalizations were used for analyses of 30-day mortality (mean [SD] patient age, 80.2 [8.5] years; 621 412 men and 961 616 women) and 1 540 797 were used for analyses of readmission (mean [SD] patient age, 80.1 [8.5] years; 602 115 men and 938 682 women). Patients treated by female physicians had lower 30-day mortality (adjusted mortality, 11.07% vs 11.49%; adjusted risk difference, –0.43%; 95% CI, –0.57% to –0.28%; P < .001; number needed to treat to prevent 1 death, 233) and lower 30-day readmissions (adjusted readmissions, 15.02% vs 15.57%; adjusted risk difference, –0.55%; 95% CI, –0.71% to –0.39%; P < .001; number needed to treat to prevent 1 readmission, 182) than patients cared for by male physicians, after accounting for potential confounders. Our findings were unaffected when restricting analyses to patients treated by hospitalists. Differences persisted across 8 common medical conditions and across patients’ severity of illness.
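The number-needed-to-treat figures quoted above follow directly from the adjusted risk differences: NNT is the reciprocal of the absolute risk difference, conventionally rounded up. A minimal arithmetic check:

```python
import math

# Number needed to treat (NNT) from an absolute risk difference.
# Checks the figures quoted above: risk differences of 0.43 and 0.55
# percentage points yield NNTs of 233 and 182, respectively.
def nnt(absolute_risk_difference: float) -> int:
    """Reciprocal of |ARD|, rounded up per convention."""
    return math.ceil(1 / abs(absolute_risk_difference))

print(nnt(0.0043))  # 233 (to prevent 1 death)
print(nnt(0.0055))  # 182 (to prevent 1 readmission)
```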

Conclusions and Relevance Elderly hospitalized patients treated by female internists have lower mortality and readmissions compared with those cared for by male internists. These findings suggest that the differences in practice patterns between male and female physicians, as suggested in previous studies, may have important clinical implications for patient outcomes.


Reducing Sodium Intake in the Population.

McCarron DA, Alderman MH.

JAMA. 2016 Dec 20;316(23):2550. doi: 10.1001/jama.2016.16100. No abstract available.

PMID: 27997645



Reducing Sodium Intake in the Population-Reply.

Frieden TR.

JAMA. 2016 Dec 20;316(23):2550-2551. doi: 10.1001/jama.2016.16106. No abstract available.

PMID: 27997649



Sodium Restriction in Patients With CKD: A Randomized Controlled Trial of Self-management Support.

Meuleman Y, Hoekstra T, Dekker FW, Navis G, Vogt L, van der Boog PJ, Bos WJ, van Montfrans GA, van Dijk S; ESMO Study Group.

Am J Kidney Dis. 2016 Dec 16. pii: S0272-6386(16)30574-1. doi: 10.1053/j.ajkd.2016.08.042. [Epub ahead of print]

PMID: 27993433




To evaluate the effectiveness and sustainability of self-managed sodium restriction in patients with chronic kidney disease.


Open randomized controlled trial.


Patients with moderately decreased kidney function from 4 hospitals in the Netherlands.


Regular care was compared with regular care plus an intervention comprising education, motivational interviewing, coaching, and self-monitoring of blood pressure (BP) and sodium.


Primary outcomes were sodium excretion and BP after the 3-month intervention and at 6-month follow-up. Secondary outcomes were protein excretion, kidney function, antihypertensive medication, self-efficacy, and health-related quality of life (HRQoL).


At baseline, mean sodium excretion rate was 163.6±64.9 (SD) mmol/24 h; mean estimated glomerular filtration rate was 49.7±25.6mL/min/1.73m2; median protein excretion rate was 0.8 (IQR, 0.4-1.7) g/24 h; and mean 24-hour ambulatory systolic and diastolic BPs were 129±15 and 76±9mmHg, respectively. Compared to regular care only (n=71), at 3 months, the intervention group (n=67) showed reduced sodium excretion rate (mean change, -30.3 [95% CI, -54.7 to -5.9] mmol/24 h), daytime ambulatory diastolic BP (mean change, -3.4 [95% CI, -6.3 to -0.6] mmHg), diastolic office BP (mean change, -5.2 [95% CI, -8.4 to -2.1] mmHg), protein excretion (mean change, -0.4 [95% CI, -0.7 to -0.1] g/24h), and improved self-efficacy (mean change, 0.5 [95% CI, 0.1 to 0.9]). At 6 months, differences in sodium excretion rates and ambulatory BPs between the groups were not significant, but differences were detected in systolic and diastolic office BPs (mean changes of -7.3 [95% CI, -12.7 to -1.9] and -3.8 [95% CI, -6.9 to -0.6] mmHg, respectively), protein excretion (mean changes, -0.3 [95% CI, -0.6 to -0.1] g/24h), and self-efficacy (mean change, 0.5 [95% CI, 0.0 to 0.9]). No differences in kidney function, medication, and HRQoL were observed.


Nonblinding, relatively low response rate, and missing data.


Compared to regular care only, this self-management intervention modestly improved outcomes, although effects on sodium excretion and ambulatory BP diminish over time.


Behavior change; blood pressure; chronic kidney disease (CKD); dietary sodium intake; disease progression; health-related quality of life (HRQoL); hypertension; kidney function; lifestyle interventions; modifiable risk factor; nutrition; protein excretion; randomized controlled trial; self-efficacy; self-management support


Anticholinergic exposure in a cohort of adults aged 80 years and over: Associations of the MARANTE scale with mortality and hospitalisation.

Wauters M, Klamer T, Elseviers M, Vaes B, Dalleur O, Degryse J, Durán C, Christiaens T, Azermai M, Vander Stichele R.

Basic Clin Pharmacol Toxicol. 2016 Dec 20. doi: 10.1111/bcpt.12744. [Epub ahead of print]

PMID: 27995743



Anticholinergics are frequently prescribed for older adults and can lead to adverse drug events. The novel MARANTE (Muscarinic Acetylcholinergic Receptor ANTagonist Exposure) scale measures the anticholinergic exposure by incorporating potency and dosages of each medication into its calculations. The aims were to assess prevalence and intensity of the anticholinergic exposure in a longitudinal cohort study of community-dwelling patients aged 80 years and over (n=503) and to study the impact on mortality and hospitalisation. Chronic medication use at baseline (November 2008 - September 2009) was entered and codified with the Anatomical Therapeutic Chemical classification. Time-to-event analysis until first hospitalisation or death was performed at 18 months after inclusion, using Kaplan-Meier curves. Cox regression was performed to control for covariates. Mean age was 84 years (range 80 - 102), and mean number of medications was 5 (range 0 - 16). Prevalence of anticholinergic use was 31.8%, with 9% taking ≥2 anticholinergics (range 0 - 4). Main indications for anticholinergics were depression, pain and gastric dysfunction. Female gender, the level of multimorbidity and the number of medications were associated with anticholinergic use. Mortality and hospitalisation rate were 8.9%, and 31.0% respectively. After adjustment for the level of multimorbidity and medication intake, multivariable analysis showed increased risks of mortality (HR 2.3, 95%CI 1.07 - 4.78) and hospitalisation (HR 1.7; 95%CI 1.13 - 2.59) in those with high anticholinergic exposure. The longitudinal study among Belgian community-dwelling oldest old demonstrated great anticholinergic exposure, which was associated with increased risk of mortality and hospitalization after 18 months.
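The abstract's time-to-event analysis rests on Kaplan-Meier curves. As a minimal sketch of that method (the product-limit estimator, written here in pure Python; this is not the study's code, and the example data are invented):

```python
# Minimal Kaplan-Meier (product-limit) survival estimator, a sketch of the
# time-to-event method named in the abstract; not the study's actual analysis.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. death/hospitalisation) occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        at_risk = n - i              # subjects still under observation at time t
        d = 0
        while i < n and data[i][0] == t:
            d += data[i][1]          # count events at t; censorings don't count
            i += 1
        if d:                        # survival drops only at event times
            s *= 1 - d / at_risk
            curve.append((t, s))
    return curve

# Hypothetical data: events at t=1, 3, 4; one subject censored at t=2.
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1]))  # [(1, 0.75), (3, 0.375), (4, 0.0)]
```

Censored subjects leave the risk set without dropping the survival curve, which is why the censoring at t=2 produces no step in the output.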


Association of Dynamics in Lean and Fat Mass Measures with Mortality in Frail Older Women.

Zaslavsky O, Rillamas-Sun E, Li W, Going S, Datta M, Snetselaar L, Zelber-Sagi S.

J Nutr Health Aging. 2017;21(1):112-119. doi: 10.1007/s12603-016-0730-1.

PMID: 27999857




The relationship between body composition and mortality in frail older people is unclear. We used dual-x-ray absorptiometry (DXA) data to examine the association between dynamics in whole-body composition and appendicular (4 limbs) and central (trunk) compartments and all-cause mortality in frail older women.


Prospective study with up to 19 years of follow up.


Community dwelling older (≥65) women.


876 frail older participants of the Women's Health Initiative Observational Study with a single measure of body composition and 581 participants with two measures.


Frailty was determined using modified Fried's criteria. All-cause mortality hazard was modeled as a function of static (single-occasion) or dynamic changes (difference between two time points) in body composition using Cox regression.


Analyses adjusted for age, ethnicity, income, smoking, cardiovascular disease, diabetes, stroke, number of frailty criteria and whole-body lean mass showed progressively decreased rates of mortality in women with higher appendicular fat mass (FM) (P for trend=0.01), higher trunk FM (P for trend=0.03) and higher whole-body FM (P for trend=0.01). The hazard rate ratio for participants with more than a 5% decline in FM between tw