Dean Pomerleau

Dietary Fiber - Health Promoter or Anti-CR Hunger-Suppressor?


All,

 

There hasn't been much talk around here lately directly related to CR - so I figured now is a good time to bring up a topic I've been puzzling over for a while now. I wonder if anyone else is feeling the same cognitive dissonance that I am.

 

It involves the apparent health benefits of fiber on the one-hand, and the so-called "Hunger Hypothesis" (HH) on the other. In a nutshell, the HH is the idea that experiencing hunger may be important (some say critical) for manifesting the benefits of CR.

 

The benefits of fiber were highlighted recently by this study [1]. It followed 1,600+ older adults (49 years and older) for 10 years, using repeated food frequency questionnaires to assess diet and its relationship to "healthy aging", defined as the "absence of disability, depressive symptoms, cognitive impairment, respiratory symptoms, and chronic diseases (eg, cancer and coronary artery disease)."

 

It found that folks in the highest quartile of fiber intake had nearly 80% greater odds of aging successfully than those in the lowest quartile. Interestingly, vegetable fiber wasn't as protective as fiber from fruit or grains/cereal. 
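As an aside on interpreting that number: the study reports an odds ratio (OR 1.79), and "79% greater odds" is not the same as "79% more likely" when the outcome isn't rare. Here's a minimal sketch of the conversion; the 13% baseline success rate is a made-up illustration, not a figure from [1]:

```python
# Convert an odds ratio plus a baseline probability into the implied
# probability. The 0.13 baseline below is hypothetical, for illustration only.
def or_to_probability(baseline_p, odds_ratio):
    """Return the probability implied by applying odds_ratio to baseline_p."""
    baseline_odds = baseline_p / (1 - baseline_p)
    new_odds = odds_ratio * baseline_odds
    return new_odds / (1 + new_odds)

p1 = or_to_probability(0.13, 1.79)  # ~0.21, i.e. ~62% "more likely", not 79%
```

So an OR of 1.79 on a non-rare outcome like successful aging corresponds to a somewhat smaller increase in actual probability.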

 

But if we know anything, we know that a high-fiber, high-volume, low-GI diet is a great way to reduce the hunger that accompanies CR - the very hunger that, some say, may be required for CR to be beneficial under the "Hunger Hypothesis".

 

Michael discusses the HH in his comprehensive SENS blog post on the primate CR studies, suggesting it might explain the disappointing monkey results: over the years, the monkeys in the NIA's CR group appeared to become less motivated by food [3], which suggests they weren't experiencing much hunger.

 

He suggests neuropeptide Y (NPY) and ghrelin as two candidate signalling molecules associated with hunger that might mediate the HH effect on longevity. He focuses a lot on NPY, since it seems to be elevated both by acute fasting and by at least several months of chronic CR - which makes it unusual among the hormones and neuropeptides involved in energy homeostasis, which generally tend to return to baseline after a few weeks or months of chronic energy restriction.

 

But the evidence he provides in that blog post to support the involvement of NPY (or ghrelin for that matter) in the longevity benefits of CR seems to me to be pretty scant.

 

He suggests the lack of a drop in blood pressure in the CR monkeys is suggestive of a low NPY level, since both CR and elevated NPY are usually accompanied by a drop in blood pressure. But there are lots of things that affect BP besides NPY, so his reasoning seems like a pretty big stretch. And even if a lack of elevated NPY were what explained why the CR monkeys' BP didn't drop, that still doesn't say anything (directly at least) about whether elevated NPY (a surrogate for hunger) has anything to do with the lifespan effects of CR. Although high BP is the world's #1 cause of early preventable death, ahead of tobacco and alcohol use [2], I don't think anyone (esp. Michael) would claim that you can gain CR lifespan benefits simply by reducing your BP, e.g. through sodium restriction or blood pressure medication. So if NPY is going to affect longevity, it probably isn't through its BP-lowering effect.

 

The evidence he provides to suggest a direct link between hunger (and esp. elevated NPY) and longevity seems similarly weak and tenuous. He cites [5], which found that reducing NPY via lesion or genetic mutation prevents CR from protecting mice against skin cancer. He also cites [6], a study of a drug that, among several effects relating to serotonin, may possibly (Michael's word) block the effect of NPY. Rats given the drug ate 10% less food when fed ad lib than rats not given the drug, but didn't live any longer (except for the male rats on a medium dose, who did live longer). As I said, pretty tenuous evidence for a link between NPY and longevity if you ask me.

 

If it were just Michael and the dubious evidence he provides, I think the HH could be pretty easily dismissed. But he's not the only one who advocates for it. TomB's been promoting the HH idea for a while, and even claims to be its originator. In one post he says:

 

I'm more convinced than ever that this is true - and there are more and more papers coming out that highlight the vital role of ghrelin and neuropeptide-Y and other molecules in mediating many of the [CR] benefits.

 

Tom - care to back up that bold claim with an argument and citations that are more convincing than what Michael points to?

 

But it's not just amateur scientists like Michael and Tom promoting the HH.

 

Dr. Speakman (who spoke at our recent CR Conference) is also an advocate for the HH. As exhibit A, he says rodents remain hungry when subjected to prolonged CR [7], which by itself is neither surprising nor especially strong evidence in favor of the HH.

 

But in [8] (discussed here) he goes more or less all-in for the HH. In it he calls a high fiber diet "calorie dilution" rather than "calorie restriction". He claims rodents allowed to eat as much as they want of a high fiber diet become satiated and therefore stop eating voluntarily before consuming as many calories as rodents fed normal chow ad lib. He suggests in [8] that this calorie dilution effect is the explanation for the recent, blasphemous Solon-Biet study [9]. Solon-Biet et al. suggest that it is protein restriction (PR), and not calorie restriction, that mediates the observed benefits of CR, via a PR-induced reduction in mTOR activity, saying in [9] that:

 

 Longevity and health were optimized when protein was replaced with
carbohydrate to limit compensatory feeding for protein and suppress protein
intake. These consequences are associated with hepatic mammalian target of
rapamycin (mTOR) activation and mitochondrial function and, in turn, related to
circulating branched-chain amino acids and glucose. Calorie restriction achieved 
by high-protein diets or dietary dilution had no beneficial effects on lifespan. 
The results suggest that longevity can be extended in ad libitum-fed animals by
manipulating the ratio of macronutrients to inhibit mTOR activation.

 

Speakman begs to differ. He suggests in [8] that Solon-Biet et al. employed a calorie dilution paradigm, feeding all their mice ad lib but adding fiber to modulate calorie intake on the different diets. Speakman says this is a bad idea. In his view, rodents need to be hungry to live longer as a result of CR, so diluting their food with non-nutritive fiber, leaving them satisfied eating fewer calories, won't trigger CR benefits. 

 

Maybe I'm missing something, but Speakman's claim about [9] puzzles me. Why? Because in [9] the mice fed a low-protein, high-carb diet ate more (both volume-wise and calorie-wise), and got fatter as a result, but did in fact live longer, seemingly in contradiction to Speakman's claim that satiation precludes the longevity benefits. Here is a handy graphical abstract of [9] to get a better feel for what I mean:

 

[Graphical abstract of [9]]

 

See this post for more discussion of [9] and Speakman's interpretation of it.

 

Overall, despite my respect for Michael, Tom and Dr. Speakman, I'm dubious. First, I'm dubious in general about the promise of serious (hunger-inducing) CR to extend lifespan significantly more than an obesity-avoiding diet and lifestyle. The full evidence can be found in this thread, but a big part of it is data from the vegan Adventists, who live longer, eat more and are a lot heavier than Okinawans, who follow a traditional, much lower calorie diet. More to the point, vegan Adventists following a healthy diet and lifestyle live 10-14 years longer than the general population, and eat 3x as much fiber as the average American (46g vs. 15g). If a high fiber diet is so bad, why do the vegan Adventists do so well on it?

 

The combination of the evidence from the Adventists and from [1] supporting the health- and longevity-promoting effects of fiber in humans, and the weak rodent evidence supporting the Hunger Hypothesis, makes me pretty dubious about any deleterious effects of fiber or of feeling satiated. 

 

But I've got an open mind on the subject. Would any of you HH advocates care to take a shot at convincing me and the rest of the fiber-munching CR folks around here of the validity of your perspective - namely, that we are deluding ourselves by diluting our diets and still hoping to enjoy CR benefits? Or, put another way, that we need to be CRed and hungry to benefit from CR.

 

--Dean

 

----------

[1] J Gerontol A Biol Sci Med Sci. 2016 Jun 1. pii: glw091. [Epub ahead of print]

 
Association Between Carbohydrate Nutrition and Successful Aging Over 10 Years.
 
Gopinath B(1), Flood VM(2), Kifley A(3), Louie JC(4), Mitchell P(3).
 
 
BACKGROUND: We prospectively examined the relationship between dietary glycemic
index (GI) and glycemic load (GL), carbohydrate, sugars, and fiber intake
(including fruits, vegetable or breads/cereals fiber) with successful aging
(determined through a multidomain approach).
 
METHODS: A total of 1,609 adults aged 49 years and older who were free of cancer,
coronary artery disease, and stroke at baseline were followed for 10 years.
Dietary data were collected using a semiquantitative Food Frequency
Questionnaire. Successful aging status was determined through
interviewer-administered questionnaire at each visit and was defined as the
absence of disability, depressive symptoms, cognitive impairment, respiratory
symptoms, and chronic diseases (eg, cancer and coronary artery disease).
RESULTS: In all, 249 (15.5%) participants had aged successfully 10 years later.
Dietary GI, GL, and carbohydrate intake were not significantly associated with
successful aging. However, participants in the highest versus lowest (reference
group) quartile of total fiber intake had greater odds of aging successfully than
suboptimal aging, multivariable-adjusted odds ratio (OR), 1.79 (95% confidence
interval [CI] 1.13-2.84). Those who remained consistently below the median in
consumption of fiber from breads/cereal and fruit compared with the rest of
cohort were less likely to age successfully, OR 0.53 (95% CI 0.34-0.84) and OR
0.64 (95% CI 0.44-0.95), respectively.
 
CONCLUSIONS: Consumption of dietary fiber from breads/cereals and fruits
independently influenced the likelihood of aging successfully over 10 years.
These findings suggest that increasing intake of fiber-rich foods could be a
successful strategy in reaching old age disease free and fully functional.
 
PMID: 27252308
 
----------
[2] Lancet. 2012 Dec 15;380(9859):2224-60. doi: 10.1016/S0140-6736(12)61766-8.
 
A comparative risk assessment of burden of disease and injury attributable to 67 
risk factors and risk factor clusters in 21 regions, 1990-2010: a systematic
analysis for the Global Burden of Disease Study 2010.
 
Lim SS, et al
 
BACKGROUND: Quantification of the disease burden caused by different risks
informs prevention by providing an account of health loss different to that
provided by a disease-by-disease analysis. No complete revision of global disease
burden caused by risk factors has been done since a comparative risk assessment
in 2000, and no previous analysis has assessed changes in burden attributable to 
risk factors over time.
METHODS: We estimated deaths and disability-adjusted life years (DALYs; sum of
years lived with disability [YLD] and years of life lost [YLL]) attributable to
the independent effects of 67 risk factors and clusters of risk factors for 21
regions in 1990 and 2010. We estimated exposure distributions for each year,
region, sex, and age group, and relative risks per unit of exposure by
systematically reviewing and synthesising published and unpublished data. We used
these estimates, together with estimates of cause-specific deaths and DALYs from 
the Global Burden of Disease Study 2010, to calculate the burden attributable to 
each risk factor exposure compared with the theoretical-minimum-risk exposure. We
incorporated uncertainty in disease burden, relative risks, and exposures into
our estimates of attributable burden.
FINDINGS: In 2010, the three leading risk factors for global disease burden were 
high blood pressure (7·0% [95% uncertainty interval 6·2-7·7] of global DALYs),
tobacco smoking including second-hand smoke (6·3% [5·5-7·0]), and alcohol use
(5·5% [5·0-5·9]). In 1990, the leading risks were childhood underweight (7·9%
[6·8-9·4]), household air pollution from solid fuels (HAP; 7·0% [5·6-8·3]), and
tobacco smoking including second-hand smoke (6·1% [5·4-6·8]). Dietary risk
factors and physical inactivity collectively accounted for 10·0% (95% UI
9·2-10·8) of global DALYs in 2010, with the most prominent dietary risks being
diets low in fruits and those high in sodium. Several risks that primarily affect
childhood communicable diseases, including unimproved water and sanitation and
childhood micronutrient deficiencies, fell in rank between 1990 and 2010, with
unimproved water and sanitation accounting for 0·9% (0·4-1·6) of global DALYs in 
2010. However, in most of sub-Saharan Africa childhood underweight, HAP, and
non-exclusive and discontinued breastfeeding were the leading risks in 2010,
while HAP was the leading risk in south Asia. The leading risk factor in Eastern 
Europe, most of Latin America, and southern sub-Saharan Africa in 2010 was
alcohol use; in most of Asia, North Africa and Middle East, and central Europe it
was high blood pressure. Despite declines, tobacco smoking including second-hand 
smoke remained the leading risk in high-income north America and western Europe. 
High body-mass index has increased globally and it is the leading risk in
Australasia and southern Latin America, and also ranks high in other high-income 
regions, North Africa and Middle East, and Oceania.
INTERPRETATION: Worldwide, the contribution of different risk factors to disease 
burden has changed substantially, with a shift away from risks for communicable
diseases in children towards those for non-communicable diseases in adults. These
changes are related to the ageing population, decreased mortality among children 
younger than 5 years, changes in cause-of-death composition, and changes in risk 
factor exposures. New evidence has led to changes in the magnitude of key risks
including unimproved water and sanitation, vitamin A and zinc deficiencies, and
ambient particulate matter pollution. The extent to which the epidemiological
shift has occurred and what the leading risks currently are varies greatly across
regions. In much of sub-Saharan Africa, the leading risks are still those
associated with poverty and those that affect children.
FUNDING: Bill & Melinda Gates Foundation.
 
Copyright © 2012 Elsevier Ltd. All rights reserved.
 
PMCID: PMC4156511
PMID: 23245609
 
-----------
[3] Mattison JA, Black A, Huck J, Moscrip T, Handy A, Tilmont E, Roth GS, Lane MA, Ingram DK. Age-related decline in caloric intake and motivation for food in rhesus monkeys. Neurobiol Aging. 2005 Jul;26(7):1117-27. Epub 2004 Dec 10. PubMed PMID: 15748792.
 
[4] Minor RK, Chang JW, de Cabo R. Hungry for life: How the arcuate nucleus and neuropeptide Y may play a critical role in mediating the benefits of calorie restriction. Mol Cell Endocrinol. 2009 Feb 5;299(1):79-88. doi: 10.1016/j.mce.2008.10.044. Epub 2008 Nov 11. Review. PubMed PMID: 19041366; PubMed Central PMCID: PMC2668104.
 
[5] Minor RK, López M, Younts CM, Jones B, Pearson KJ, Anson RM, Diéguez C, de Cabo R. The arcuate nucleus and neuropeptide Y contribute to the antitumorigenic effect of calorie restriction. Aging Cell. 2011 Jun;10(3):483-92. doi: 10.1111/j.1474-9726.2011.00693.x. Epub 2011 Apr 5. PubMed PMID: 21385308; PubMed Central PMCID: PMC3094497.
 
[6] Smith DL Jr, Robertson HT, Desmond RA, Nagy TR, Allison DB. No compelling evidence that sibutramine prolongs life in rodents despite providing a dose-dependent reduction in body weight. Int J Obes (Lond). 2011 May;35(5):652-7. doi: 10.1038/ijo.2010.247. Epub 2010 Nov 16. PubMed PMID: 21079617; PubMed Central PMCID: PMC3091992.
 
[7] Hambly C, Mercer JG, Speakman JR. Hunger does not diminish over time in mice under protracted caloric restriction. Rejuvenation Res. 2007 Dec;10(4):533-42. PubMed PMID: 17990972.

 

--------------

[8]  Exp Gerontol. 2016 Mar 19. pii: S0531-5565(16)30069-9. doi:

10.1016/j.exger.2016.03.011. [Epub ahead of print]
 
Calories or protein? The effect of dietary restriction on lifespan in rodents is 
explained by calories alone.
 
Speakman JR(1), Mitchell SE(2), Mazidi M(3).
 
 
Almost exactly 100 years ago Osborne and colleagues demonstrated that restricting 
the food intake of a small number of female rats extended their lifespan. In the 
1930s experiments on the impact of diet on lifespan were extended by Slonaker,
and subsequently McCay. Slonaker concluded that there was a strong impact of
protein intake on lifespan, while McCay concluded that calories are the main
factor causing differences in lifespan when animals are restricted (Calorie
restriction or CR). Hence from the very beginning the question of whether food
restriction acts on lifespan via reduced calorie intake or reduced protein intake
was disputed. Subsequent work supported the idea that calories were the dominant 
factor. More recently, however, this role has again been questioned, particularly
in studies of insects. Here we review the data regarding previous studies of
protein and calorie restriction in rodents. We show that increasing CR (with
simultaneous protein restriction: PR) increases lifespan, and that CR with no PR 
generates an identical effect. None of the residual variation in the impact of CR
(with PR) on lifespan could be traced to variation in macronutrient content of
the diet. Other studies show that low protein content in the diet does increase
median lifespan, but the effect is smaller than the CR effect. We conclude that
CR is a valid phenomenon in rodents that cannot be explained by changes in
protein intake, but that there is a separate phenomenon linking protein intake to
lifespan, which acts over a different range of protein intakes than is typical in
CR studies. This suggests there may be a fundamental difference in the responses 
of insects and rodents to CR. This may be traced to differences in the physiology
of these groups, or reflect a major methodological difference between
'restriction' studies performed on rodents and insects. We suggest that studies
where the diet is supplied ad libitum, but diluted with inert components, should 
perhaps be called dietary or caloric dilution, rather than dietary or caloric
restriction, to distinguish these potentially important methodological
differences.
 
Copyright © 2016 Elsevier Inc. All rights reserved.
 
PMID: 27006163

 

----------------

[9] Cell Metab. 2014 Mar 4;19(3):418-30. doi: 10.1016/j.cmet.2014.02.009.

 
The ratio of macronutrients, not caloric intake, dictates cardiometabolic health,
aging, and longevity in ad libitum-fed mice.
 
Solon-Biet SM(1), McMahon AC(2), Ballard JW(3), Ruohonen K(4), Wu LE(5), Cogger
VC(2), Warren A(2), Huang X(2), Pichaud N(3), Melvin RG(6), Gokarn R(7), Khalil
M(8), Turner N(9), Cooney GJ(9), Sinclair DA(10), Raubenheimer D(11), Le Couteur 
DG(12), Simpson SJ(13).
 
 
Comment in
    Science. 2014 Mar 7;343(6175):1068.
 
The fundamental questions of what represents a macronutritionally balanced diet
and how this maintains health and longevity remain unanswered. Here, the
Geometric Framework, a state-space nutritional modeling method, was used to
measure interactive effects of dietary energy, protein, fat, and carbohydrate on 
food intake, cardiometabolic phenotype, and longevity in mice fed one of 25 diets
ad libitum. Food intake was regulated primarily by protein and carbohydrate
content. Longevity and health were optimized when protein was replaced with
carbohydrate to limit compensatory feeding for protein and suppress protein
intake. These consequences are associated with hepatic mammalian target of
rapamycin (mTOR) activation and mitochondrial function and, in turn, related to
circulating branched-chain amino acids and glucose. Calorie restriction achieved 
by high-protein diets or dietary dilution had no beneficial effects on lifespan. 
The results suggest that longevity can be extended in ad libitum-fed animals by
manipulating the ratio of macronutrients to inhibit mTOR activation.
 
Copyright © 2014 Elsevier Inc. All rights reserved.
 
PMID: 24606899


Dean, an experiment that seemed to me to address whether hunger-associated hormones are involved in longevity is (1).

 

The methods they used were:

 

 

 

Based on the Baltimore Longitudinal Study of Aging, “Long-lived” participants who survived to at least 90 years of age (n=41, cases) were compared with “Short-lived” participants who died between 72–76 years of age (n=31, controls) in the nested case control study. Circulating levels of ghrelin, insulin, leptin, interleukin 6, adiponectin and testosterone were measured from samples collected between the ages 58 to 70 years.

 

Related to hunger, among other CR-related factors:

 

 

 

To further expand our understanding of these findings, we compared a group of participants in the BLSA who survived to greater than 90 years of age with a group of participants who died between ages 72–76 years. All BLSA participants considered had been evaluated in their 60s, were healthy, and at that time had donated a blood sample. Based on previous literature, we selected potential biomarkers that are known to be affected by CR and are involved in energy homeostasis and lipid metabolism, including the following (with ↑ indicating CR-related increase and ↓ indicating CR-related decrease) insulin ↓ (12), ghrelin ↑ (13), leptin ↓ (14–16) and adiponectin ↑ (17). In addition, we also examined interleukin (IL)-6 ↓, which is a biomarker of inflammation as well as testosterone ↑, which is a biomarker of anabolic hormone, because they are associated with aging, age-related conditions and mortality (16, 18–21). Consequently, the aim of this study was to compare the level of above mentioned individual biomarkers between long- and short-lived healthy men and women.

 

They only used a small number of subjects, so they needed to use a panel of markers to achieve significance. Still, the results did show ghrelin was higher and leptin lower in the long-lived subjects who survived into their 90s versus the ones who died younger:

 

 


Table 1

Subject Characteristics at Age 58–70 Years Among Short-lived and Long-lived Participants.

                                  Short-lived          Long-lived
                                  (Died at 72-76)      (Survived to 90)    P
  Subjects, n (W/M)               31 (4/27)            41 (2/39)
                                  Mean (SD)            Mean (SD)
  Age at Evaluation, years        63.9 (3.2)           64.9 (3.5)          0.29
  Age Death/Censor, years         73.5 (1.2)           92.4 (2.6)          <.001
  Weight, kg                      77.3 (9.3)           75.7 (8.3)          0.43
  Height, cm                      174.1 (6.5)          173.7 (5.9)         0.78
  BMI, kg/m2                      25.5 (2.7)           25.1 (2.4)          0.48
  Waist circumference, cm         90.3 (10.0)          87.1 (6.4)          0.25
  Systolic BP, mmHg               139.5 (19.7)         130.1 (18.7)        0.05
  Diastolic BP, mmHg              85.1 (12.9)          80.1 (11.9)         0.10
  Heart Rate, beats/minute        76.2 (11.4)          72.7 (8.5)          0.17
  Fasting Plasma Glucose, mg/dL   101.7 (11.1)         100.9 (7.7)         0.75
  Physical activity, MET*min      371.6 (225.0)        368.4 (353)         0.97
  Fat free mass, kg               55.2 (7.5)           55.4 (5.4)          0.94
  Biomarkers
    Ghrelin, pg/mL                99.8 (68.5)          102.3 (72.6)        0.91
    Insulin, μU/mL                5.4 (3.8)            5.1 (4.2)           0.75
    Leptin, ng/mL                 26.6 (38.4)          15.0 (12.0)         0.11
    IL-6, pg/mL                   2.2 (1.4)            2.2 (2.2)           0.95
    Adiponectin, μg/mL            14.7 (8.6)           12.2 (6.6)          0.21
    Testosterone, ng/mL           5.5 (5.6)            5.9 (5.5)           0.83
  Cause of Death, %                                                        0.006
    Cardiovascular                39                   24
    Cancer                        23                   5
    Other                         39                   41
    Unknown                       0                    22
    Alive                         0                    7
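The paper's "global score" method (per the abstract, "combining information from multiple biomarkers by adding the z-transformed values") can be sketched roughly as follows. The sign conventions and subject values here are my own assumptions for illustration, not the paper's code or data:

```python
from statistics import mean, stdev

# Sketch of the BLSA global-score idea: z-transform each biomarker across
# subjects, orient the sign so the CR-associated direction counts as positive
# (CR reportedly raises ghrelin/adiponectin/testosterone and lowers
# insulin/leptin/IL-6), then sum the signed z values per subject.
CR_DIRECTION = {"ghrelin": +1, "insulin": -1, "leptin": -1,
                "il6": -1, "adiponectin": +1, "testosterone": +1}

def global_scores(subjects):
    """subjects: list of dicts mapping biomarker name -> measured level.
    Returns one signed z-score sum ("global score") per subject."""
    stats = {m: (mean(s[m] for s in subjects), stdev(s[m] for s in subjects))
             for m in CR_DIRECTION}
    return [sum(sign * (s[m] - stats[m][0]) / stats[m][1]
                for m, sign in CR_DIRECTION.items())
            for s in subjects]

# Hypothetical subjects: index 1 has the most "CR-like" profile, index 2 the least.
subjects = [
    {"ghrelin": 100, "insulin": 5, "leptin": 15, "il6": 2, "adiponectin": 12, "testosterone": 6},
    {"ghrelin": 140, "insulin": 3, "leptin": 10, "il6": 1, "adiponectin": 16, "testosterone": 7},
    {"ghrelin": 60, "insulin": 8, "leptin": 30, "il6": 4, "adiponectin": 8, "testosterone": 4},
]
scores = global_scores(subjects)  # the "CR-like" subject gets the highest score
```

The point of summing z values is that no single biomarker has to reach significance on its own; small differences in the CR-associated direction can add up across the panel, which is how the paper managed to separate the two groups (p=0.05) despite none of the individual markers differing significantly.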
 
Edited by AlPater


Thanks Al,

 

It doesn't appear that you actually posted the reference to the Baltimore Longitudinal Study of Aging paper you seemed to be talking about. I've included it below [1].

 

Sorry, but I also had real trouble grokking the excerpts you included, so I've taken the liberty of extracting one of the graphics from [1] to help illustrate what I think you were trying to say. First, here are the demographics of the folks who were short-lived (72-76 at time of death) or long-lived (90+ at time of death):

 

[Demographics table from [1]]

 

As you can see, they didn't measure NPY, the 'hunger hormone' that Michael, Tom and Dr. Speakman consider most important for the HH. The other hunger hormone the HH proponents point to is ghrelin, which was virtually identical (p = 0.91!) between the short-lived and long-lived BLSA participants. I wouldn't consider that strong (or even weak) support for the HH. The only thing that was significantly associated with living longer in this group was (systolic) blood pressure. I sure hope Michael isn't going to try to claim this is a result of elevated NPY...

 

Yes, you are right that leptin was almost significantly lower in the long-lived folks, but lower leptin could simply be a surrogate marker for less (visceral) fat - and nobody is claiming that visceral fat is healthy or longevity-promoting. Quite the contrary in fact: greater abdominal obesity (and therefore very likely higher leptin) is associated with an increased incidence of metabolic syndrome in these same BLSA folks [2]. We're not talking about fatness vs. leanness here - we're talking about fiber and the HH as outlined by its proponents. Notice that testosterone was, if anything, slightly higher at baseline in the folks who made it past 90, and fasting insulin wasn't significantly lower... Nor was their BMI significantly lower (25.5 vs. 25.1)...

 

But back to fiber. There is interesting data on the association between fiber and longevity in these BLSA folks. The researchers in [3] measured the baseline diet of BLSA participants over a 7-day period and then followed them for 18 years to see which of them died, and how mortality related to what they'd been eating. From the free full text, here is what they found:

 

Survivors had significantly greater intakes of [fruits and vegetables], dietary fiber, magnesium, and vitamin C, and lower intakes of saturated fat than those who died from CHD or other causes, and higher intake of folate than those who died from CHD.

 

Even more astonishing, they found that:

 

 In fully adjusted models ... dietary fiber [showed] a 6% reduction [in mortality risk] / g (P < 0.05).

 
In other words, each extra gram per day of fiber someone in the BLSA cohort ate during the 7-day baseline period was associated with a 6% lower likelihood of dying during the 18y follow-up period. Further, the beneficial fiber effect was largely associated with the kind of fiber we eat most around here - the fiber in fruits and veggies.
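To get a feel for what a 6%-per-gram association would imply if taken at face value, here's a back-of-envelope sketch. To be clear, this naive multiplicative compounding is my extrapolation for illustration, not a claim from [3] - observational per-gram associations can't really be extrapolated over large intake differences:

```python
# Naive illustration: if each extra gram/day of fiber were independently
# associated with a 6% lower mortality risk, the implied relative risk
# would compound multiplicatively. This extrapolation is illustrative
# only; it is NOT how the association in [3] should be read at large doses.
PER_GRAM_RR = 0.94  # 6% lower risk per extra gram/day, per [3]

def implied_relative_risk(extra_grams_per_day):
    return PER_GRAM_RR ** extra_grams_per_day

rr_10g = implied_relative_risk(10)  # ~0.54 under this naive compounding
```

Even allowing for heavy attenuation at higher intakes, it's hard to square an association of this size and direction with the idea that fiber is sabotaging anyone's longevity.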
 
So rather than being the exception, which is what I think you were trying to suggest, it appears to me that the BLSA folks prove the rule - namely, that fiber, particularly when packaged as lots of fruits and veggies, is a health- and longevity-promoting dietary component. It's not the "CR benefits buster" that one interpretation of the HH would suggest.
 
In short - keep on eating those fiber-rich, water-rich and therefore satiating fruits and veggies. They're the best. Hunger hypothesis be damned...
 
Anyone else care to speak up for the hunger hypothesis?
 
[Note - I heard from HH-advocate TomB in a private message. He's very busy at the moment but promises to respond soon. I think (and hope) he meant respond to this thread. I don't even hope anymore for a response from Michael to anything I write. Even my hand-crafted Bat Signal fails to arouse him from his dogmatic slumber. I now have the attitude that when posts from Michael happen they are like lightning strikes, or manna from heaven ☺]
 
--Dean
 

-----------

[1]  Aging Clin Exp Res. 2011 Apr;23(2):153-8.

 
Relationship between plasma ghrelin, insulin, leptin, interleukin 6, adiponectin,
testosterone and longevity in the Baltimore Longitudinal Study of Aging.
 
Stenholm S(1), Metter EJ, Roth GS, Ingram DK, Mattison JA, Taub DD, Ferrucci L.
 
Author information: 
(1)National Institute on Aging, Clinical Research Branch, Longitudinal Studies
Section, Baltimore, MD 21225, USA.
 
 
BACKGROUND AND AIMS: Caloric restriction (CR) is the most robust and reproducible
intervention for slowing aging, and maintaining health and vitality in animals.
Previous studies found that CR is associated with changes in specific biomarkers 
in monkeys that were also associated with reduced risk of mortality in healthy
men. In this study we examine the association between other potential biomarkers 
related to CR and extended lifespan in healthy humans.
METHODS: Based on the Baltimore Longitudinal Study of Aging, "long-lived"
participants who survived to at least 90 years of age (n=41, cases) were compared
with "short-lived" participants who died between 72-76 years of age (n=31,
controls) in the nested case control study. Circulating levels of ghrelin,
insulin, leptin, interleukin 6, adiponectin and testosterone were measured from
samples collected between the ages 58 to 70 years. Baseline differences between
groups were examined with t-test or Wilcoxon test, and mixed effects general
linear model was used for a logistic model to differentiate the two groups with
multiple measurements on some subjects.
RESULTS: At the time of biomarkers evaluation (58-70 yrs), none of the single
biomarker levels was significantly different between the two groups. However,
after combining information from multiple biomarkers by adding the z-transformed 
values, the global score differentiated the long- and short-lived participants
(p=0.05).
CONCLUSIONS: In their sixties, long-lived and short-lived individuals do not
differ in biomarkers that have been associated with CR in animals. However,
difference between the groups was only obtained when multiple biomarker
dysregulation was considered.
 
PMCID: PMC3180822

PMID: 21743292 

 

------------

[2] J Gerontol A Biol Sci Med Sci. 2009 May;64(5):590-8. doi: 10.1093/gerona/glp004. 

Epub 2009 Mar 6.
 
Longitudinal paths to the metabolic syndrome: can the incidence of the metabolic 
syndrome be predicted? The Baltimore Longitudinal Study of Aging.
 
Scuteri A(1), Morrell CH, Najjar SS, Muller D, Andres R, Ferrucci L, Lakatta EG.
 
Author information: 
(1)UO Geriatria, Istituto Nazionale Ricovero e Cura per Anziani, Rome, 00189,
Italy. angeloelefante@interfree.it
 
OBJECTIVE: To determine the predictors of incidence of metabolic syndrome (MetS) 
(Adult Treatment Panel III criteria) and to determine if longitudinal changes in 
specific MetS components differ by age or gender in participants who developed
versus those who did not develop MetS.
METHODS: A total of 506 men and 461 women (baseline age 52.4 +/- 17.5 years) from
the Baltimore Longitudinal Study on Aging (BLSA) were followed longitudinally (at
least two study visits), and censored when they developed the MetS or reported
use of antihypertensive or lipid-lowering medications.
RESULTS: After a follow-up period of 6 years, the incidence of the MetS was 25.5%
in men and 14.8% in women. As many as 66% of men and 73% of women with one or two
altered MetS components at baseline did not develop the MetS. Predictors of
developing MetS were higher baseline abdominal obesity or triglycerides and lower
high-density lipoprotein cholesterol (area under receiver-operated curve [AUC] = 
0.84 in men, 0.88 in women). Addition of the rate of changes in MetS components
over time slightly improved predictive accuracy (AUC = 0.94 in men, 0.92 in
women). Men were more likely than women to have the MetS without obesity, whereas
women were more likely than men to have the MetS without an altered glucose
metabolism.
CONCLUSIONS: The patterns of MetS components and the longitudinal changes that
lead to the MetS are different in men and women. Interestingly, components with
the highest prevalence prior to MetS development, such as elevated blood
pressure, are not necessarily the stronger risk factors.
 
PMCID: PMC4017826
PMID: 19270183
 
---------
[3] J Nutr. 2005 Mar;135(3):556-61.
 
The combination of high fruit and vegetable and low saturated fat intakes is more
protective against mortality in aging men than is either alone: the Baltimore
Longitudinal Study of Aging.
 
Tucker KL(1), Hallfrisch J, Qiao N, Muller D, Andres R, Fleg JL; Baltimore
Longitudinal Study of Aging.
 
Author information: 
(1)Jean Mayer USDA Human Nutrition Research Center on Aging, Tufts University,
Boston, MA, USA. katharine.tucker@tufts.edu
 
 
Saturated fat (SF) intake contributes to the risk of coronary heart disease (CHD)
mortality. Recently, the protective effects of fruit and vegetable (FV) intake on
both CHD and all-cause mortality were documented. However, individuals consuming 
more FV may be displacing higher-fat foods. Therefore, we investigated the
individual and combined effects of FV and SF consumption on total and CHD
mortality among 501 initially healthy men in the Baltimore Longitudinal Study of 
Aging (BLSA). Over a mean 18 y of follow-up, 7-d diet records were taken at 1-7
visits. Cause of death was ascertained from death certificates, hospital records,
and autopsy data. After adjustment for age, total energy intake, BMI, smoking,
alcohol use, dietary supplements, and physical activity score, FV and SF intakes 
were individually associated with lower all-cause and CHD mortality (P < 0.05).
When both FV and SF were included in the same model, associations of each were
attenuated with CHD mortality, and no longer significant for all-cause mortality.
Men consuming the combination of ≥5 servings of FV/d and ≤12% energy
from SF were 31% less likely to die of any cause (P < 0.05), and 76% less likely 
to die from CHD (P < 0.001), relative to those consuming < 5 FV and >12% SF. Men 
consuming either low SF or high FV, but not both, did not have a significantly
lower risk of total mortality; but did have 64-67% lower risk of CHD mortality (P
< 0.05) relative to those doing neither. These results confirm the protective
effects of low SF and high FV intake against CHD mortality. In addition, they
extend these findings by demonstrating that the combination of both behaviors is 
more protective than either alone, suggesting that their beneficial effects are
mediated by different mechanisms.
 
PMID: 15735093


Before anyone jumps on me about the small differences in fiber intake between the BLSA long-lived (21.8g/day of fiber) vs. short-lived (16.8g/day of fiber) men (PMID 15735093), or the similarly small differences between the "healthy agers" (29g/day), "suboptimal agers" (27.7g/day), and people who died (26.4g/day) in the study I opened this thread with (PMID 27252308), I'll do it for you. Yes, those are small deltas in fiber intake. But in both studies the association between higher fiber intake and a longer/better life was highly significant.

 

However, I will acknowledge that neither study probably bears much on the hunger hypothesis, since those relatively low fiber intakes, and the small differences between groups, probably have little impact on satiety/hunger.

 

But I stand by the vegan Adventist data. They eat nearly twice as much fiber as all these folks (46g/day - certainly enough to reduce their hunger) and live the longest of all on average. Even comparing within different Adventist subpopulations, we see a nice correlation between fiber intake and reduced mortality over a significant range of fiber consumption. The vegan Adventists eat about 50% more fiber than the omnivorous Adventists (46g vs. 30g), and had a 15% lower mortality rate over about 6 years of follow-up in [1].

 

--Dean

 

--------------

[1] JAMA Intern Med. 2013 Jul 8;173(13):1230-8. doi: 10.1001/jamainternmed.2013.6473.

Vegetarian dietary patterns and mortality in Adventist Health Study 2.
 
Orlich MJ(1), Singh PN, Sabaté J, Jaceldo-Siegl K, Fan J, Knutsen S, Beeson WL,
Fraser GE.
 
Author information:
(1)School of Public Health, Loma Linda University, Loma Linda, CA 92350, USA.
morlich@llu.edu
 
Comment in
JAMA Intern Med. 2014 Jan;174(1):168-9.
JAMA Intern Med. 2014 Jan;174(1):169.
JAMA Intern Med. 2013 Jul 8;173(13):1238-9.
Dtsch Med Wochenschr. 2013 Sep;138(39):1930.
 
IMPORTANCE: Some evidence suggests vegetarian dietary patterns may be associated
with reduced mortality, but the relationship is not well established.
OBJECTIVE: To evaluate the association between vegetarian dietary patterns and
mortality.
DESIGN: Prospective cohort study; mortality analysis by Cox proportional hazards
regression, controlling for important demographic and lifestyle confounders.
SETTING: Adventist Health Study 2 (AHS-2), a large North American cohort.
PARTICIPANTS: A total of 96,469 Seventh-day Adventist men and women recruited
between 2002 and 2007, from which an analytic sample of 73,308 participants
remained after exclusions.
EXPOSURES: Diet was assessed at baseline by a quantitative food frequency
questionnaire and categorized into 5 dietary patterns: nonvegetarian,
semi-vegetarian, pesco-vegetarian, lacto-ovo-vegetarian, and vegan.
MAIN OUTCOME AND MEASURE: The relationship between vegetarian dietary patterns
and all-cause and cause-specific mortality; deaths through 2009 were identified
from the National Death Index.
RESULTS: There were 2570 deaths among 73,308 participants during a mean follow-up
time of 5.79 years. The mortality rate was 6.05 (95% CI, 5.82-6.29) deaths per
1000 person-years. The adjusted hazard ratio (HR) for all-cause mortality in all
vegetarians combined vs nonvegetarians was 0.88 (95% CI, 0.80-0.97). The adjusted
HR for all-cause mortality in vegans was 0.85 (95% CI, 0.73-1.01); in
lacto-ovo-vegetarians, 0.91 (95% CI, 0.82-1.00); in pesco-vegetarians, 0.81 (95%
CI, 0.69-0.94); and in semi-vegetarians, 0.92 (95% CI, 0.75-1.13) compared with
nonvegetarians. Significant associations with vegetarian diets were detected for
cardiovascular mortality, noncardiovascular noncancer mortality, renal mortality,
and endocrine mortality. Associations in men were larger and more often
significant than were those in women.
CONCLUSIONS AND RELEVANCE: Vegetarian diets are associated with lower all-cause
mortality and with some reductions in cause-specific mortality. Results appeared
to be more robust in males. These favorable associations should be considered
carefully by those offering dietary guidance.
 
PMCID: PMC4191896
PMID: 23836264


Hi Dean!

 

IMO, the thing that stands out in the table that you published is the difference in WAIST CIRCUMFERENCE -- much better for the longer lived group.  IMO, you and others tend to overemphasize BMI, which is a crude tool -- a much more useful statistic is waist/hip ratio.

 

Also, about the HH -- at least at CR IX, the speaker who emphasized this was Dr. Miller, not Dr. Speakman -- although I had the highest regard for the opinions of both speakers.  (I also found Dr. J's description of the irregularities in the NIA monkey study very interesting.)

 

It was a great conference.  (The only turkey was XXXXX.)

 

  :)

 

  --  Saul

Edited by Dean Pomerleau
Edited to delete name out of respect for all CR Conference speakers. Sorry Saul, that was neither a cool thing to say nor was it relevant to this conversation. --Dean


Hi Saul, welcome back and thanks for dropping by to share your thoughts (although see my edit to your post above - sorry about that but it's not cool, appropriate or relevant to this conversation to insult any of the CR Conference speakers who generously gave their time to be with us and present their research).

 

Back to the topic of this thread, fiber and the Hunger Hypothesis (HH).

 

You wrote:

Also, about the HH -- at least at CR IX, the speaker who emphasized this was Dr. Miller, not Dr. Speakman -- although I had the highest regard for the opinions of both speakers.  (I also found Dr. J's description of the irregularities in the NIA monkey study very interesting.)

 

I took 3-pages of careful notes during Dr. Miller's invited talk, and upon reviewing them, I can't find any evidence of him saying anything directly related to the HH. In contrast, Dr. Speakman spoke directly (although not convincingly, IMO) about his argument for the HH during his session, including his "calorie dilution" hypothesis and the importance of NPY.

 

In fact, the research most relevant to the HH that I could find Dr. Miller speaking about appears, if anything, to undermine the HH. In particular, he spoke of the ability of acarbose to extend lifespan, at least in male mice, based on his published work [1]. Why is acarbose relevant to the HH? Because (in his words) "it blocks the chopping up of complex carbs into glucose". This helps with glucose control, which is why it is sometimes used to treat diabetes. While relatively safe for humans, acarbose slows digestion, can cause weight loss due to malabsorption and diarrhea, and, among several other unpleasant side effects, causes bloating and loss of appetite. Together these can lead to reduced calorie absorption, weight loss, and a reduction in hunger. If something causes both a net reduction in calorie absorption and a reduction in hunger, yet nevertheless extends lifespan, that would seem to speak against the HH, at least indirectly, since it shows (unintentional) CR without hunger increases longevity.

 

Knowing you and knowing the thoroughness of my own notes, I suspect you are perhaps thinking of off-the-cuff remarks Dr. Miller may have made that were relevant to the HH during the Q&A panel session at the end of the conference. I didn't take notes during that session. 

 

I'm not sure if you've noticed, but I tend to heavily discount casual remarks that people make with little thought behind them and without evidence to back them up.

 

That is how I consider comments made by people (even respected scientists like Dr. Miller) during a 'shoot the sh*t' session like that panel discussion. Such sessions enable researchers to "let their hair down" and are therefore a great opportunity to probe them about their intuitions and hunches. But their remarks in such a context need to be taken with a big grain of salt, a point I'm sure all of them would emphasize if asked. So even if Dr. Miller spoke in favor of the HH in his casual remarks during the panel, I wouldn't put much weight on it unless he provided an argument and/or data to back it up - which I don't recall him doing. Do you, or anyone else who attended?

 

BTW Saul, not that it matters much, but what is your personal perspective on fiber and the HH?

 

--Dean

 

------------

[1]  Aging Cell. 2014 Apr;13(2):273-82. doi: 10.1111/acel.12170. Epub 2013 Nov 19.

 
Acarbose, 17-α-estradiol, and nordihydroguaiaretic acid extend mouse lifespan
preferentially in males.
 
Harrison DE(1), Strong R, Allison DB, Ames BN, Astle CM, Atamna H, Fernandez E,
Flurkey K, Javors MA, Nadon NL, Nelson JF, Pletcher S, Simpkins JW, Smith D,
Wilkinson JE, Miller RA.
 
Author information: 
(1)The Jackson Laboratory, Bar Harbor, ME, 04609, USA.
 
Four agents--acarbose (ACA), 17-α-estradiol (EST), nordihydroguaiaretic acid
(NDGA), and methylene blue (MB)--were evaluated for lifespan effects in
genetically heterogeneous mice tested at three sites. Acarbose increased male
median lifespan by 22% (P < 0.0001), but increased female median lifespan by only
5% (P = 0.01). This sexual dimorphism in ACA lifespan effect could not be
explained by differences in effects on weight. Maximum lifespan (90th percentile)
increased 11% (P < 0.001) in males and 9% (P = 0.001) in females. EST increased
male median lifespan by 12% (P = 0.002), but did not lead to a significant effect
on maximum lifespan. The benefits of EST were much stronger at one test site than
at the other two and were not explained by effects on body weight. EST did not
alter female lifespan. NDGA increased male median lifespan by 8-10% at three
different doses, with P-values ranging from 0.04 to 0.005. Females did not show a
lifespan benefit from NDGA, even at a dose that produced blood levels similar to 
those in males, which did show a strong lifespan benefit. MB did not alter median
lifespan of males or females, but did produce a small, statistically significant 
(6%, P = 0.004) increase in female maximum lifespan. These results provide new
pharmacological models for exploring processes that regulate the timing of aging 
and late-life diseases, and in particular for testing hypotheses about sexual
dimorphism in aging and health.
 
© 2013 The Authors. Aging Cell published by the Anatomical Society and John Wiley
& Sons Ltd.
 
PMCID: PMC3954939
PMID: 24245565


Hi Dean!

 

Actually, during his talk, Dr. Miller did mention the hunger hypothesis.  I asked him a question to clarify it, and he responded: "If you aren't hungry, then you aren't on CR".  I noted lower body temperature (I didn't mention bloodwork markers) -- but Dr. Miller (and Michael Rae, who was sitting in the seat next to me) repeated: "If you aren't hungry, then you aren't on CR".

 

What's my opinion?  I don't know; certainly, this is not one of Luigi's hypotheses.

 

I think it would be nice to run a test:  E.g., take two sets of long-lived mice of the same strain and sex -- both fed similar 30% CRAN diets, group 1 without extra fiber, group 2 with ad lib fiber (the right fiber would be important -- one needs a form that won't be made significantly digestible by gut bacteria).  This is the sort of test that, e.g., Dr. Spindler's group could run.

 

Instead of mice, one might be able to do a study with flies -- that would have results sooner.  (But we're more closely related to mice). 

 

Of course, one test would not be a proof; as Dr. Miller pointed out, at least 3 such tests, at different labs, using the same protocol, might be required before he (Dr. Miller) would be convinced.

 

  --  Saul 


Thanks Saul,

 

I certainly could have missed that comment by Dr. Miller about the need for hunger. Do you (or anyone) remember the context in which he mentioned the HH, or any evidence he gave to back up his statement? Or was it just an off-the-cuff remark?

 

The "calorie dilution" experiment you describe would be an interesting way to test the HH directly, at least in mice. It is sort of like the Solon-Biet study (PMID 24606899) I started this thread with, and that Dr. Speakman criticized, except it would keep calories and macronutrient intake the same and just add indigestible fiber to one group's diet to increase their satiety while leaving the other group hungry, to see which group lived longer.
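For what it's worth, if such a two-group mouse experiment were ever run, the lifespan comparison would typically be analyzed with a log-rank test. Here's a minimal sketch in plain Python; the function name and all the lifespans are invented for illustration, and this simple version assumes no censoring (every animal is followed until death):

```python
# Two-sample log-rank test, written from scratch (no censoring assumed).
from itertools import chain

def logrank_chi2(group1, group2):
    """Return the log-rank chi-square statistic (1 df) for two groups of
    event times. Values above ~3.84 correspond to p < 0.05."""
    observed = expected = variance = 0.0
    for t in sorted(set(chain(group1, group2))):
        n1 = sum(1 for x in group1 if x >= t)  # group 1 still at risk at t
        n2 = sum(1 for x in group2 if x >= t)  # group 2 still at risk at t
        n = n1 + n2
        d1 = group1.count(t)                   # deaths in group 1 at t
        d = d1 + group2.count(t)               # total deaths at t
        observed += d1
        expected += d * n1 / n
        if n > 1:
            variance += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return (observed - expected) ** 2 / variance

# Hypothetical lifespans (days): fiber-satiated CR mice vs. hungry CR mice
satiated = [820, 850, 870, 900, 930]
hungry = [950, 980, 1000, 1030, 1060]
print(f"log-rank chi-square = {logrank_chi2(satiated, hungry):.2f}")
# prints: log-rank chi-square = 9.70
```

With made-up survival curves as cleanly separated as these, the statistic (about 9.7) lands well past the 3.84 cutoff for p < 0.05; real mouse cohorts would of course need far larger group sizes.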

 

--Dean


Yes; I hope that someone runs such a study.  (Fruit flies would be faster -- but might be less convincing as a surrogate for humans.  The Buck Institute for Research on Aging, which we visited for CR VIII about 3 years ago, has brilliant researchers, several of whom use fruit flies.)

 

  --  Saul


In general, there is a sad lack of studies involving the effects of a really high fiber diet on health (to say nothing of longevity). But one I forgot to mention in my opening remarks in this thread was the 'Near Perfect Diet' study [1], which I discussed in detail here.  The vegan arm of that protocol included nearly 140g of fiber per day. Here is a post I just made to the LongeCity thread on the benefits of fiber about this study and its implications. My short summary from that post is:

 

This high-fiber vegan diet was dramatically more beneficial for CVD risk factors than the other two diets tested, which actually contained respectable, but much lower, amounts of fiber (25g and 47g).

 
So with a healthy plant-based diet there appear to be further benefits to be had well beyond a respectable fiber intake of 47g - which is interesting, because 47g is virtually identical to the 46g the average long-lived vegan Adventist eats.

 

--Dean

 

----------

[1] Metabolism. 2001 Apr;50(4):494-503.

 
Effect of a very-high-fiber vegetable, fruit, and nut diet on serum lipids and
colonic function.
 
Jenkins DJ(1), Kendall CW, Popovich DG, Vidgen E, Mehling CC, Vuksan V, Ransom
TP, Rao AV, Rosenberg-Zand R, Tariq N, Corey P, Jones PJ, Raeini M, Story JA,
Furumoto EJ, Illingworth DR, Pappu AS, Connelly PW.
 
 
We tested the effects of feeding a diet very high in fiber from fruit and
vegetables. The levels fed were those, which had originally inspired the dietary 
fiber hypothesis related to colon cancer and heart disease prevention and also
may have been eaten early in human evolution. Ten healthy volunteers each took 3 
metabolic diets of 2 weeks duration. The diets were: high-vegetable, fruit, and
nut (very-high-fiber, 55 g/1,000 kcal); starch-based containing cereals and
legumes (early agricultural diet); or low-fat (contemporary therapeutic diet).
All diets were intended to be weight-maintaining (mean intake, 2,577 kcal/d).
Compared with the starch-based and low-fat diets, the high-fiber vegetable diet
resulted in the largest reduction in low-density lipoprotein (LDL) cholesterol
(33% +/- 4%, P <.001) and the greatest fecal bile acid output (1.13 +/- 0.30 g/d,
P =.002), fecal bulk (906 +/- 130 g/d, P <.001), and fecal short-chain fatty acid
outputs (78 +/- 13 mmol/d, P <.001). Nevertheless, due to the increase in fecal
bulk, the actual concentrations of fecal bile acids were lowest on the vegetable 
diet (1.2 mg/g wet weight, P =.002). Maximum lipid reductions occurred within 1
week. Urinary mevalonic acid excretion increased (P =.036) on the high-vegetable 
diet reflecting large fecal steroid losses. We conclude that very high-vegetable 
fiber intakes reduce risk factors for cardiovascular disease and possibly colon
cancer. Vegetable and fruit fibers therefore warrant further detailed
investigation.
 

PMID: 11288049


“Anti-CR” - caveat: there is an argument to be had as to whether humans, monkeys, dogs, and even rats/mice (Dean) experience a CR-response at all, apart from obesity avoidance (and perhaps not even then, going by the ob/ob mice example). 

 

We can define the CR-response as occurring when an organism functions under the conditions of a deficit in the supply of calories. This would be distinct from a net deficit (i.e. supply minus expenditure, as in the case of exercise). 

 

However, even this is a squishy concept, because CR’d organisms expend energy too - so it’s been suggested that the CR-response happens when you take in fewer calories than you would if you were a normal weight ad lib. But that quickly lends itself to problems - what is “normal” weight? Is BMI a relevant measurement, body composition as a factor in BMI etc. 

 

Even in Walford’s original The 120-Year Diet it’s something vague along the lines of “your high school weight, if you weren’t overweight in high school” - which has the distinct whiff of begging the question.

 

What is “normal” weight for any given species? I don’t think this is trivial, as it is tied in to the concept of ad lib consumption, and as we’ve seen repeatedly, including in the two monkey studies, the answers to these questions will guide the design of a study: is it ad lib when you let the monkeys get obese (Wisconsin)? Or do you prevent obesity by restricting calories yet still call it “ad lib” (NIH)? How can not letting monkeys eat freely be called “ad lib”? 

 

The response might be that in most lab conditions animals eat out of boredom, not true “ad lib”, and therefore it’s justified to restrict them to “normal” weight even if we need to refine the definition of ad lib to “when not bored” - and it can even be backed up by studies showing animals in enriched environments live longer than non-enriched lab conditions, as they are then not eating out of boredom. In this scenario, true ad lib describes intake of food driven solely by hunger and satiety, without additional factors of boredom, stress relief, competition and other lab-derived confounders. 

 

But even so, what is “normal” weight for any given species? Do we even know that for humans? After all, haven’t we been shifting the BMI ranges between “underweight”, “overweight”, and “obese”? Clearly there is the skeletal unhealthy weight (f.ex. anorexia) - but what is healthy? Is there a specific and very precise number? Again, these are not niggling trivial questions and pedantry, because relatively small shifts in BMI have definite health consequences on a population level. If we can’t even answer such basic questions, then one must ask: what is CR and the CR-response? Does it even exist? Just what have scientists been studying all these decades (at least since the 1930’s) and written those tens and tens of thousands of papers on all those studies? I'll try to bring some clarity to these questions below.

 

The HH is a hypothesis, and I’m guided here by two considerations.

 

The first consideration is evolutionary. A number of scientists have made evolutionary-related arguments about the CR-response, particularly as it applies to humans; the latter famously by Aubrey de Grey. Various mechanisms were proposed (Weather Hypothesis), and I believe our very own MR has put out a paper (PMC2464717). 

 

To address the evolutionary argument, let us note that what we are calling the CR-response has been present - pretty robustly - from very early on evolutionarily speaking (f.ex. yeast) and preserved pretty consistently in almost all species studied. For any such biological pathway to be preserved, it must have had some evolutionary advantage - and (this being how evolution works) been passed on in reproduction. 

 

It must also mean that the conditions that reward that pathway have persisted - or else the pathway would have been eliminated. 

 

Periodic Food Scarcity qualifies as such a persistent condition. It affects all species (that we know of). It has persisted as a frequent condition since life began. Therefore, adaptations that deal with it must be present in all species from early on till today. Those organisms that did not have the adaptation, or did not pass it on, died out and were evolutionarily eliminated. The CR-response allows the organism to outlive the period of scarcity in order to then reproduce later. Therefore, it naturally would have resulted in a longer lifespan - immediate reproduction not being an option. From this one can establish that (1) organisms have evolved adaptations to survive periods of food scarcity, and (2) these adaptations naturally result in a longer lifespan when invoked vs when not invoked (not invoked in periods of plenty).

 

What would the counter-argument be: The phenomenon of Periodic Food Scarcity does not exist? Organisms have not evolved any capacity to deal with such a widespread, persistent and life/death-consequential phenomenon? Neither of these seem plausible.

 

How would the longer life be obtained? Various fitness mechanisms would become stronger, and calories re-directed from reproduction to survival. 

 

This response is fundamentally different from a “net deficit” situation. In times of food scarcity, the supply of calories is limited. Take a CRONie vs Sports Guy. Both might operate at a relative caloric deficit. Both might be non-obese. Both are “healthy”. But only one (the CRONie) experiences supply-limited calories - whereas the deficit of the Sports Guy is not driven by insufficient supply, but by large expenditure. Therefore in a period of food scarcity, the Sports Guy would be at a survival disadvantage, since you can’t expend your way to more calories during a drought/famine. Evolution favors the CR-response vs the “net deficit”. The CR-response is preserved evolutionarily vs the Sports Guy.

 

So far, it seems therefore: (A) A life-prolonging CR-response as distinct from net-calorie deficit exists, i.e. it is the result of a supply deficit and it is evolutionarily preserved.

 

How does the HH tie into CR? During the CR-response period, physiological processes are redirected from reproduction to survival - which makes sense. However, while the organism is under the CR-response regimen, why should it experience hunger? Why shouldn’t it instead experience no hunger, and rather achieve satiety under the same low-calorie conditions? Why is hunger an obligatory feature of the CR-response? 

 

The answer: reproduction - the only means of species perpetuation, and therefore also a guarantee of the preservation of the biological pathway that allows such reproduction - the CR-response.

 

What is hunger? Hunger is a compulsion to seek out more calories (to alleviate compelling discomfort). In the absence of hunger there is no reason for an organism to expend effort to obtain more calories - no mechanism compels it to do so. 

 

Let us look at two scenarios. One: an organism that survives on low calories and has no sensation of hunger beyond those low calories (in Dean’s terms: is satiated on low calories). An organism like that can therefore survive for its entire lifespan on those low calories regardless of the availability of calories - in times of famine and in times of plenty. If it experiences no hunger beyond the low calories it consumes during times of famine, then what reason would it have to up its calorie intake in times of plenty? None. 

 

Now contrast that with an organism that is identical, except it experiences hunger on those low calories - a hunger it can not satisfy on low calories during the entire time of famine, but it can satisfy that by upping - the now available - calories in times of plenty. 

 

What will be the evolutionary outcome of those two scenarios? Only one will reproduce (during a time of plenty). It is the one with a sense of hunger while on low calories - and not the one with no hunger beyond the low calories. The one that experiences hunger will in times of plenty up its calories to eliminate hunger and with higher calories it will switch back from maintenance to reproduction, and since it will reproduce it will pass on its CR Plus Hunger model. One passes on its model (CR Plus Hunger), one does not (CR Minus Hunger).
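To make the selection logic of these two scenarios fully explicit, here is a deliberately crude toy model - purely my own illustrative construction, with all calorie numbers and thresholds invented:

```python
# Toy model of the two scenarios above. All numbers are invented for
# illustration; only the selection logic matters.
REPRO_THRESHOLD = 100  # hypothetical intake needed to fund reproduction
FAMINE_INTAKE = 60     # what the famine supplies; both organisms survive on it

def intake_in_time_of_plenty(feels_hunger: bool) -> int:
    """Once food is abundant again, intake is driven only by the hunger signal."""
    if feels_hunger:
        return 120          # hunger compels eating up to satiety
    return FAMINE_INTAKE    # satiated on famine rations: no drive to eat more

def passes_on_genes(feels_hunger: bool) -> bool:
    """Reproduction (and hence inheritance of the hunger-coupled CR-response)
    requires crossing the calorie threshold in a time of plenty."""
    return intake_in_time_of_plenty(feels_hunger) >= REPRO_THRESHOLD

print(passes_on_genes(True))   # CR Plus Hunger lineage reproduces: True
print(passes_on_genes(False))  # CR Minus Hunger lineage does not: False
```

Crude as it is, it shows why only the hunger-coupled variant of the response can be inherited: without the hunger signal, intake never crosses the reproduction threshold even when food returns.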

 

Therefore we establish (B) that the CR-response must incorporate Hunger in order to be evolutionarily preserved; hunger is indeed a necessary condition.

 

This also allows us to roughly pinpoint at which moment the organism experiences the CR-response: when the caloric deficit has reached a level at which the adaptation kicks in, a defining component of which is switching away from reproduction to survival. For example, in humans it would be amenorrhea for the female, elimination of libido for the male, and other physiological adaptations. If you have not experienced that, you are not having a CR-response.

 

Furthermore, reproduction ties into a longer life in a paradoxical fashion. Once an organism has reproduced, and passed on its genes, there is no further evolutionary interest in keeping the organism alive (the premise of Richard Dawkins’ The Selfish Gene). This has been somewhat modified within a species-wide setting with the “grandmother hypothesis” - if the offspring is particularly helpless and needs support for a long time, the mother organism is evolutionarily pressured to stick around, and even beyond its own offspring it would favor the group right into grandmother years long past reproduction.

 

This is where reproduction enters into the CR-response paradigm. Why does the CR-response necessarily involve turning away from reproduction and into survival? Because a time of famine and food scarcity would not be an opportune time to bring offspring into the world, something that is energy intensive. There is no energy to spare, and the result of such reproduction would fail to thrive. So it makes sense to switch away from reproduction into maintenance and survival. 

 

Secondly, this ties into life prolongation. The only organisms that carry the CR-response into the next generation are the ones that survive and reproduce - but in order to reproduce and take care of the next generation the organism must stick around in good enough health to accomplish that - past the point where the other organism has already reproduced because there was no famine for it - and so, necessarily the CR’d animal lives longer. The aging in the CR’d animal has been slowed down or delayed until it can again reproduce and care for offspring and the genetic legacy. The non-CR’d animal does not need to delay its reproduction and hence its aging.

 

To sum up A and B: a CR-response prolongs life, it is the result of a deficit in the supply of calories, the deficit is to the point of redirection from reproduction, and must be accompanied by hunger in the evolutionary perspective.

 

[Having lost this post in a browser crash, grr, I’m going to break it up into additional posts as I write. Also apologies for the length of this text, but hopefully breaking it up will somewhat alleviate the word-flood.]

Edited by TomBAvoider


The CR-response in lab animal studies and comparative diet studies in humans.


 


Lab animals


 


No question the CR-response that results in a longer lifespan might be obliterated through any number of conditions: genetic models of organisms that can only exist in artificial lab conditions and are not the result of evolutionary development (rats/mice commonly referred to as ‘genetic f’ups’ around here), pharmacological interventions, peculiar unusual diets that are lab-derived and not encountered in an environment that was subject to evolutionary pressure, other artificial non-evolutionary lab conditions such as unnatural (to the species) temperatures (Dean), boredom, lack/presence of pathogens, hunger etc.


 


Dean points to the relative failure of evoking the CR-response of prolonged lifespan in lab animals, the most prominent being the NIH monkeys. But that requires a fair test. If you want to evoke the evolutionarily preserved CR-response, you need to present the animals with the same conditions that would have evoked that response in the wild. And therefore one can have any number of questions wrt. the NIH monkeys (that will also apply to varying degrees to CR studies in other species such as rodents).


 


Did the NIH monkeys experience ravenous hunger? No? If no, then you have just eliminated a fundamental requirement for the CR-response to kick in (per my argument above that Hunger is a necessary condition of the CR-response). So too for the rest: were the calories cut back far enough to have the monkeys switch away from reproduction (elimination of estrus in the females)? Were they affected by unnatural interventions (f.ex. military pharmacological experiments on NIH monkeys)? Were they housed in unusual temperatures (compared to the wild)? Did they experience unnatural lab conditions, such as boredom, non-natural light/dark cycles, unusual troop interactions? And so on for many other factors.


 


One of the ways in which scientists have tried to eliminate the “unnatural genetic lab freak” effect (f.ex. the ob/ob mice) was to switch to wild rodents. But while that may take care of the genetic aspect, it does nothing for all the other lab-derived CR-response eliminators.


 


Bottom line, I don’t know if we actually have many clear CR-response studies out there, and bringing up failures of the response in the lab is likely not a valid test of the CR-response. In order to study and test CR-response animals, you actually have to have CR-response animals without the CR-response being eliminated by lab conditions and confounders.


 


Perhaps the CR-response does not apply in monkeys and humans. But the NIH experiment, in my mind, does not resolve the issue for all the reasons above.


 


The argument (made by Aubrey de Grey among others) that CR in monkeys or other longer-lived species like humans would only give 2-3 years is not sufficiently grounded. Have monkeys - and humans - in their evolutionary past experienced periods of famine? Undoubtedly yes. Would they therefore have been compelled to develop the same adaptations to survive the famine as rodents would? Undoubtedly yes.


 


In fact, AdeG acknowledges as much, by not claiming that the CR response doesn’t happen for humans - how could he? After all, it doesn’t take much mathematical modeling to show that if periods of famine exist, those individuals that are better suited (through exhibiting the CR-response) will survive and pass on their genes, whereas those that don’t exhibit such an adaptation would perish and be eliminated from the gene pool. Those that claim no CR-response exists in monkeys and humans (or longer-lived species in general) need to supply arguments as to how the same evolutionary forces and environmental conditions that act on other species (and manifestly apply to monkeys and humans: reproduction, survival, hunger and famine) don’t act and have the same effect on longer-lived species such as monkeys and humans. Monkeys (and humans) have starved (on a population level), they both need to reproduce, if they don’t survive starvation they don’t get to reproduce etc. So far, we have not seen any such convincing contrary arguments.


 


Human diets


 


The problem with trying to compare CR'd humans with non-CR'd healthy people (and the non-CR'd healthy might be: high fiber, 7th Day Adventist, obesity-avoidant, etc.) is that we need to identify the CR'd cohort first.


 


Dean brings up the case of the Okinawans (as CR'd) vs the 7th Day Adventists (Loma Linda Blue Zone) and finds the latter doing better. But as argued above, the Okinawans would not actually qualify as CR'd as I understand it from an evolutionary point of view. The evidence is quite clear - for one, for the CR response to kick in, you need to experience hunger. It is not clear to me that Okinawans experience ravenous hunger. But let us ignore the hunger criterion for the sake of argument. The other criterion is to limit the supply of calories to a degree sufficient to re-direct from reproduction to survival. I think it is pretty clear that the Okinawans don't experience that level of food deprivation voluntarily - there are no reports of Okinawans desisting temporarily from reproduction, experiencing amenorrhea, complete loss of libido, etc.


 


One could perhaps posit that cutting calories substantially, yet short of amenorrhea and non-reproductive status should already exhibit the benefits of the CR-response, and so the Okinawans are fairly classified as experiencing a CR-response. I would argue that is absolutely wrong for the following reason: the CR-response is an on-off switch. As the old joke goes: you either are pregnant, or you’re not, there is no “a bit pregnant”. You either reach the level of food deprivation mimicking the evolutionary signature of the CR-response, or you don’t. As argued above, switching away from reproduction to survival is a signature of the CR-response. Restricting calories by itself does not trigger the CR-response as we’ve identified it above - you need to reach a specific level (temporary loss of reproductive capability) to trigger the effect.


 


Cutting calories to avoid obesity is no doubt healthful, but unless cut to the levels of redirection from reproduction, it is not a CR-response, and the Okinawans demonstrably don’t reach those levels. In that scenario, perhaps the 7th Day Adventists do better than Okinawans, and massive-fiber consumers do better than non-massive fiber consumers, Okinawan, or otherwise. But none of them are being compared to true CR-response cohorts.


 


In those studies cited by Dean, you are basically comparing different levels of obesity avoidance diets - and not any of them to a CR-response diet. Even the NIH monkey study has made allowances for restricting calories in the controls - and not claiming thereby that it amounts to “CR” for the controls. Just because you are restricting calories does not mean you are evoking the CR-response, you might indeed simply be avoiding obesity and getting thinner to various degrees, and therefore not surprisingly finding no life extension “compared to obesity avoiding diets”… well duh, as the kids would say. Restricting the controls to non-obese levels was a good start in the NIH study - the problem was that unless the intervention monkeys (1) experienced substantial hunger (2) exhibited reproduction-redirection (loss of estrus etc.), then they too were merely thinner and more calorie restricted but not reaching the point of CR-response being evoked.


 


Consequently, because the subjects were not CR-respondent (nor avoiding CR-respondent lab-artifact CR-effect eliminators such as temperature etc.) none of the studies Dean cites qualify as testing the HH either. They were - at best - testing hunger in the obesity avoidance diets (and not the CR-respondent diet). Obesity avoidance and any class of healthy diets (including vegetable based, high fiber, etc.) will all have health benefits. Some diets and calorie levels may have comparatively bigger effects among themselves. But unless the caloric level drops to reproduction-redirection levels, the CR-response pathway switch has not been tripped, and you are not testing the HH in the context of the CR-response. Hunger is along a spectrum. Hunger in the absence of CR-response level of restricting calories may by itself not exhibit any particular advantage/disadvantage from health/longevity point of view.


 


The idea behind the HH (in my view of what constitutes the HH) is that if you reach the CR-response level of food restriction (re-direction from reproduction), you must also experience hunger in order to reap the full benefits of the CR-response. Btw, this is my own conception of what the HH is, and not necessarily in agreement with anybody else, like Dr. Speakman or whoever. The grounds for my conception have been given in the preceding arguments.


 


The importance of the calorie restriction reaching the level of reproduction-redirection should not be underestimated. Let us note that just shutting down the reproductive system, all by itself, can result in an expanded health/lifespan, as many studies show in animals that undergo spaying/neutering, eunuch studies, etc. A non-surgical way of reaching the same shutdown of the reproductive system, through caloric restriction, should therefore already exhibit health/longevity benefits apart from any other factors. Were the NIH monkeys - and the various humans on mild CR - deprived of this proven benefit of shutting down reproductive pathways? If yes, then perhaps it is even less surprising that they did not exhibit any lifespan extension.


 


For obvious reasons, it may not be possible to find populations that voluntarily subject themselves to the CR-response. There are anecdotal cases - such as the oldest male currently alive, at 112, being a survivor of both ghetto starvation and the Auschwitz concentration camp - but a longitudinal study of holocaust survivors (PMID: 18194229) has not found any mortality difference between holocaust survivors and non-survivors, and the survivors seemed to have less social support, lower levels of physical activity, and more psychiatric disorders. Obviously there are too many confounders for camp survivors to be a good case of a CR-response study.


 


Bottom line: there are no studies, clinical or otherwise, proving or, crucially, *disproving* the Hunger Hypothesis, and in the absence of such, it remains a hypothesis. There are however strong evolutionary arguments in favor of the HH and no arguments - that I am aware of - against it.


 




Absolute vs. Net Calorie Shortfall - Which Matters for CR Benefits?

 

Tom,

 

Thanks for your thoughtful replies. Sorry you lost your draft to a browser crash. That is always a concern of mine too, since it is something the auto save recovery feature won't save you from. Here is how I avoid that frustrating problem of potentially losing posts due to a crashed browser or computer.

 

In broad brush, your two posts seem to be trying to address three important but independent topics, only one of which seems directly relevant to the topic of this thread - fiber and the hunger hypothesis. They are:

  1. Will CR work (in Humans)? - Is there even such a thing as the magical "CR response", and is it preserved and significant in humans? Or put another way, will serious CR beat a healthy obesity-avoiding diet and lifestyle when it comes to promoting human longevity. There's a thread for that ☺.
  2. Absolute vs. Net Calorie Restriction? - Assuming there is something special we call the CR response, how is it triggered? In particular, is it triggered by a deficit in absolute calorie intake relative to some hard-to-define baseline, or is it triggered by a relative or net calorie shortfall (i.e. too few calories "left over" at the end of the day to support growth & reproduction after subtracting off basal metabolism and physical activity)?
  3. Is Hunger Necessary? - Assuming there is such a thing as the CR response and we can figure out whether it requires an absolute or net calorie deficit, there is still the question of whether or not hunger per se has anything to do with triggering the CR response.

As I said, there is a thread devoted to #1, so I'll try to avoid addressing it here. You don't say very much about it anyway, except that we both agree that it is an important open question.

 

It's really #3 that is the focus of this thread, but because so much of your discussion focuses on #2 (particularly in your first post), and because #2 is so near and dear to my own heart, I will try to respond to it as well as to #3. In fact, this entire post will be about #2, and I'll address #3 in my next post - for anyone who only cares about the HH.

 

First, a high level observation.

 

Your entire argument seems to rest on a 'just so' story told in your first post about the evolution of the so-called CR response. It appears to be a story based on your own intuitions about how you think evolution must have worked to produce and conserve the CR response. You don't appear to provide any scientific evidence in either of these two posts to support the evolutionary ideas you present, and especially and most critically in this context, none to support the notion that hunger is critical for triggering the CR response.

 

The absence of supporting scientific evidence for the story in general, and the HH in particular, wouldn't be so bad if the story you told was airtight - if the logic of your argument was valid and therefore your conclusion was undeniable. What I'll try to show is that this too is far from the case. In particular, I'll try to show that:

  1. It makes more sense from an evolutionary, biochemical and experimental evidence perspective for the CR response to be triggered by a net calorie deficit as opposed to an absolute calorie deficit.
  2. Nowhere in your evolutionary story is there any place where a logical argument is made, or it seems to me can be made, that hunger is necessary to trigger the CR response. And as far as I can tell you don't give any scientific evidence to support your argument for the HH.

 

I think a good place to start is with the final summary statement in your first post:

 

To sum up A and B: a CR-response prolongs life, it is the result of a deficit in the supply of calories, the deficit is to the point of redirection from reproduction, and must be accompanied by hunger in the evolutionary perspective.

 

As we both agree, there is reason to question the first clause ("a CR-response prolongs life") but let's leave that aside for this thread, since it's an active and contentious topic of discussion over here.

 

I also have no quibbles about the statement "the [calorie] deficit is to the point of redirection from reproduction". Based on the evidence, it seems to me pretty clear that the CR response involves a shift of resources away from reproduction to maintenance and repair. The alternative, namely the "wear and tear" explanation for CR benefits, doesn't seem to have garnered significant experimental support over the years. However I'm surprised you chose to argue in favor of an absolute calorie deficit interpretation of CR without resorting to the wear and tear explanation. It seems to me the wear and tear explanation is the only halfway tenable angle for arguing it's the absolute and not the net calorie deficit that counts. But I'm getting ahead of myself.

 

My real criticism of your argument stems from the other two clauses, which I'll address in turn in this post and the next.

 

Absolute vs. Net Calorie Shortfall

 

Your first statement I'd like to challenge is "[The CR response] is the result of a deficit in the supply of calories".

 

While I myself use it quite frequently, the use of the term "deficit" in the context of CR has always bothered me.

 

Lifelong CR can't result in a permanent calorie "deficit" as I think the term is commonly understood - if so, the animal would waste away to nothing and die. I think a better term than deficit is shortfall. When subjected to CR, organisms have a shortfall in calories, in that they have fewer calories at their disposal than their metabolism would ultimately "like" (forgive my teleological-speak) in order to simultaneously support all the metabolic processes it has evolved to engage in - basic stuff like contracting your heart to pump your blood, replacing red blood cells, fighting foreign invaders with newly generated immune cells, contracting skeletal muscles to chase down food or a mate, producing new sperm cells in males, or supporting the metabolic costs of menstruation and pregnancy in females, etc.

 

When calories are in short supply, the body senses this through a variety of pathways, and chooses to allocate its limited resources to metabolic functions that are critical for survival. This is the essence of the so-called "CR response" - when the body senses a calorie shortfall, it allocates its scarce energy resources away from growth and reproduction towards maintenance, repair and I'll add, acquiring food, as the best evolutionary strategy to survive the famine and live to procreate another day, when resources are more plentiful.

 

So the big question here is: does the body sense a calorie shortfall relative to some pre-existing, absolute calorie intake that would allow it to completely support each of its metabolic "goals" simultaneously, as you seem to argue? Or is it as I contend - namely that at the end of the day nutrient sensing pathways sense whether there remain calories available for discretionary processes (like reproduction) after basal metabolic needs have been met, and after food has been gathered and consumed?

 

My contention is that sensing of a net calorie shortfall is far more plausible given what we know about biology than sensing a shortfall relative to some absolute calorie level. 

 

You yourself acknowledge the subjective nature of trying to identify some absolute calorie baseline by which to judge whether an organism is experiencing CR. Metabolic requirements vary based on body size, age, time of the month and pregnancy status for women, etc. Metabolic rates vary from person to person depending on genetics (remember the "constitutionally lean" women with more BAT?). Ambient temperature, and the need for homeotherms to generate heat to stay warm, varies from one season to the next, and one climate to the next. Digestive efficiency varies depending on the ratio of macronutrients consumed, the amount of fiber consumed, and one's gut microbiome. The energy needed to acquire the basic necessities of life (food, water, shelter) and avoid predation varies from season to season, climate to climate, and from one social circumstance to another (part of a pack or tribe vs. loner).

 

Given all this variability, do you really think the body has some absolute calorie counter somewhere, ticking off how many calories have been consumed today, ready to kick the body into "CR mode" the moment it determines the day's calorie intake has fallen short of some predefined, absolute threshold it has also squirrelled away somewhere?

 

Forgive me, but that sounds like how a computer might be programmed to count calories using counters, registers and fixed thresholds, but it seems highly implausible for a biological organism. It just doesn't seem to me that the body works that way, with static counters and fixed thresholds.
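To caricature the contrast in code - a toy sketch with made-up numbers and function names of my own invention, with no claim that biology computes anything like this (that's the point):

```python
# Toy contrast between the two candidate trigger rules.
# All numbers and names are hypothetical, purely for illustration.

ABSOLUTE_BASELINE = 2000  # the hard-to-define "baseline" kcal/day constant

def absolute_shortfall_model(intake_kcal):
    """'Calorie counter' model: compare the day's total intake to a
    fixed, predefined baseline."""
    return intake_kcal < ABSOLUTE_BASELINE  # True -> trigger CR response

def net_shortfall_model(intake_kcal, basal_kcal, activity_kcal):
    """Net-shortfall model: are any calories left over for discretionary
    processes (growth, reproduction) after basal metabolism and the
    cost of acquiring food are paid?"""
    return (intake_kcal - basal_kcal - activity_kcal) < 0

# A hard-working winter forager: eats its usual 2000 kcal, but spends
# 1600 kcal on basal metabolism plus 800 kcal foraging and staying warm.
print(absolute_shortfall_model(2000))        # False: no absolute deficit
print(net_shortfall_model(2000, 1600, 800))  # True: net shortfall
```

The first rule needs a magic constant stored somewhere; the second needs only a running balance - which is exactly the kind of quantity the energy-sensing pathways discussed below actually report.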

 

Let me see if I can paint a more plausible picture which will bring us back around to the real topic of this thread, the hunger hypothesis.

 

The first distinction: rather than waiting until some hypothetical fixed time point to trigger the CR response, when a judgment is made as to whether the absolute calorie threshold has been met for the day, the model I posit, based on a mountain of scientific evidence, is continuous. The body is constantly sensing its energy balance through a multitude of pathways, some evolutionarily ancient (like levels of ATP vs. ADP, NAD+ vs. NADH, AMPK, SIRT1, etc.) and some relatively modern (like circulating glucose, insulin, FFAs, triglycerides, and adipokines like leptin).

 

The relative activity in all these energy sensing pathways continuously and locally determines whether the organism (or really, the cell's local environment) is in a state of energy surplus or shortfall. The organism (actually the organism's cells) uses the state of these energy-sensitive signalling molecules to adjust its allocation of resources and orchestrate metabolic processes. Here is a nice graphic from [1] illustrating this kind of continuous orchestration of metabolic processes based on energy sensing in just one of these pathways (SIRT1):

 

 

[image: NAD-15.jpg]

 

Despite what the diagram says on the right, SIRT1 expression is not triggered by an abstract concept of "Calorie Restriction", to say nothing of falling short of some absolute daily calorie intake level. Instead, SIRT1 expression is mechanistically triggered continuously and largely independently in each cell based on the current relative level of intracellular NAD+ (vs. NADH). In turn SIRT1 orchestrates a wide range of metabolic responses that virtually define the CR response. 

 

Here comes the important part relative to the question of absolute vs. net calorie restriction.

 

SIRT1 doesn't and can't know why intracellular NAD+ is elevated relative to NADH. It doesn't care and can't know whether NAD+ is elevated because of an absolute calorie shortfall (e.g. the organism simply failed to find food today), or because the organism had to spend extra calories to get the same amount of calories (e.g. had to roam further than usual today to forage or hunt its meal down, or even because it was cold out and so the organism had to burn extra calories to generate heat while foraging).

 

Instead, a cell simply knows that energy is scarce right here, right now, based on NAD+ level and so it expresses SIRT1 to initiate many aspects of the CR response. And this isn't simply a 'just so' story. This is how the CR response is actually triggered in organisms, based on evidence.

 

Before moving on to criticize your 'just so' story about the HH (in my next post) based on these multiple energy sensing pathways, let's look at your argument in support of the absolute calorie perspective to make sure I haven't missed anything.

 

You wrote:

 

This [CR] response [to famine] is fundamentally different from a “net deficit” situation. In times of food scarcity, the supply of calories is limited. Take a CRONie vs Sports Guy. 

 

 I'll address the pitiful strawman argument you make regarding "Sports Guy" shortly. 

 

But first, let's look at this sentence carefully "In times of food scarcity, the supply of calories is limited." It sounds good. Almost profound.

 

But in fact, while that sentence is true by definition (in fact it's a tautology), what it says is irrelevant to an individual organism, and undetectable and meaningless to individual cells where the real decisions are made.

 

During times of food scarcity, the total supply of calories in the environment is indeed limited, as you suggest. But what is operative from an individual organism's perspective is the effort required to obtain calories vs. the calories those efforts produce.

 

In short, even in times of extreme food scarcity, an organism that works hard enough can virtually always obtain any given number of calories from its environment. It's just that garnering those calories becomes increasingly difficult, and energetically costly, to the point where the additional calories obtained are fewer than the calories required to obtain them. So for example, an extra hour of hunting or foraging in the winter may produce an additional 200 kcal of food, but require 220 kcal of energy expenditure to engage in. That is obviously a losing proposition, at least long-term. 
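The diminishing-returns arithmetic can be sketched like so (all numbers hypothetical, chosen only to illustrate where the break-even point falls):

```python
# Diminishing foraging returns during scarcity (hypothetical numbers).
# Each successive hour of foraging yields less food for roughly the
# same energetic cost, so the net return eventually goes negative.

kcal_obtained_per_hour = [500, 350, 250, 200, 150]  # hours 1..5
kcal_spent_per_hour = 220

for hour, obtained in enumerate(kcal_obtained_per_hour, start=1):
    net = obtained - kcal_spent_per_hour
    print(f"hour {hour}: net {net:+d} kcal")
# Hours 4 and 5 are net losses (-20 and -70 kcal): past the break-even
# point, extra foraging only deepens the net calorie shortfall.
```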

 

We see behavior related to this all the time in CR animals, which paradoxically increase their energy expenditure relative to ad lib controls, especially around meal time, even when the extra activity doesn't garner them any extra calories. In fact, it's so common in both animals and anorexic humans that it has its own name - starvation-induced hyperactivity [2]. Animals are willing to spend more energy (often in vain) in order to get more calories when food is scarce, contributing further to their net calorie shortfall and forcing them to divert energy away from other metabolic processes like growth and reproduction so that it can be spent trying to get more food.

 

Here we see quite explicitly that a physical activity-induced increase in the organism's net calorie shortfall, due to its extra foraging to get the food it needs to survive, helps shift the continuous balance the body maintains between its various metabolic processes away from growth and reproduction - i.e. the so-called CR response.

 

So while your tautology "In times of food scarcity, the supply of calories is limited" is undoubtedly true on one level, it by no means supports your apparent contention that the CR response is triggered by some mysterious ability of the organism (to say nothing of individual cells, where these decisions ultimately get made!) to sense the total number of calories it's been able to obtain during the day, let alone the absolute amount of food available in the external environment!

 

In short, the CR response, is ultimately triggered in individual cells by local sensing of local signals (like NAD+ level) reflective of the available energy. It has nothing to do with how many absolute calories are available in the environment during famine conditions, or even how many absolute calories the organism is able to acquire. 

 

Which brings me to your laughable "Sports Guy" argument.

 

You said (my emphasis):

Take a CRONie vs Sports Guy. Both might operate at a relative caloric deficit. Both might be non-obese. Both are “healthy”. But only one (the CRONie) experiences supply-limited calories - whereas the deficit of the Sports Guy is not driven by insufficient supply, but by large expenditure. Therefore in a period of food scarcity, the Sports Guy would be at a survival disadvantage, since you can’t expend your way to more calories during a drought/famine.

 

Do you actually believe that last statement I've highlighted? Do you realize how silly that statement is? Of course you can expend your way to more calories during a drought/famine. How about expending extra calories by walking to the next watering hole for a drink, in hopes of finding food there during the drought?

 

For another example of expending extra calories when calories are scarce, consider what happens in the winter all the time. Animals have to work hard to forage for food, and stay warm while doing it via thermogenesis, expending many more calories to meet their nutritional needs. If the winter-induced famine continues longer than normal, do you really think the bodies of the desperate and hard-working animals won't trigger the CR response, simply because they may be getting the same absolute number of calories in the winter as during the salad days of summer (pun intended), but are forced to expend a ton more calories in the winter to get them? - i.e. they have a large net calorie deficit but no absolute calorie deficit relative to their summertime 'baseline' calorie requirements?

 

If you believe hard-working animals starving in winter due to a net (but not 'absolute') calorie deficit won't trigger the CR response, then we may have to agree to disagree. But you should be aware that this is exactly what happens in the successful CR rodent experiments in the lab. CR rodents are almost always kept at standard lab temperatures in individual cages, which are the equivalent of winter-like conditions for them, as we've discussed several times on the cold exposure thread. And the CR rodents are more active than their ad lib brethren, particularly at night and in the several hours leading up to feeding time - contributing an additional calorie deficit relative to AL. And, obviously, these conditions are the canonical (perhaps only) conditions in which the so-called CR response is observed.

 

So contra your "Sports Guy" strawman, CR animals can and do expend more calories to get food, and nevertheless do benefit from the CR response. In fact, as I've pointed out several times on the cold exposure thread, expending extra calories to stay warm may even be required for the CR response to kick in - 'cushy' CR (at thermoneutral temperatures) eliminates the CR response in rodents, despite the fact that the short-lived comfortably-warm CR rodents eat even fewer calories than the long-lived chilly CR rodents, and so by your absolute CR definition should be more CRed than the chilly CR rodents.

 

In short, I have a real hard time seeing:

  1. Why, from an evolutionary perspective, expending extra energy in physical activity would or should be excluded from the energy balance calculation the body engages in to trigger the CR response, when animals in the wild and in the lab often expend extra calories in an (albeit sometimes vain) attempt to get food when food is scarce, thereby contributing further to their net calorie deficit. Are you saying that animals that work hard to get food, expending calories in the process, shouldn't (and don't) benefit from the CR response? What evolutionary argument could you possibly make for this perspective?
  2. Even if the body somehow wanted to exclude calories expended in physical activity when determining whether to trigger the CR response, how such an exclusion would even be possible, given that the decision to trigger the CR response is made at the cellular level, and cells have no idea about the cause of the local net energy deficit they sense in their intracellular microenvironment via signals like the ratio of NAD+ to NADH.
  3. Even if there was an evolutionary reason to exclude physical activity from CR response calculations, and even if there was a mechanism by which cells could implement such an exclusion policy, what evidence is there that cells or whole organisms actually do exclude physical activity when calculating whether to trigger the CR response as a result of a calorie shortfall? If anything, the evidence seems to point the other way.

 

Which finally brings me back to the real topic of this thread - fiber and the hunger hypothesis. But rather than address the part of your 'just so' story involving the HH in this post, I think I'll follow your lead and keep this post (relatively) short.

 

Thanks Tom for prompting me to think through and articulate my thoughts on the absolute vs. net calorie question. I'll be curious what you or others think about this line of reasoning in favor of the net calorie shortfall as the trigger for the CR response.

 

--Dean

 

-----------

[1] Li X, Kazgan N. Mammalian Sirtuins and Energy Metabolism. Int J Biol Sci 2011; 7(5):575-587. doi:10.7150/ijbs.7.575. Available from http://www.ijbs.com/v07p0575.htm

 

--------

[2] Pirke KM, Broocks A, Wilckens T, Marquard R, Schweiger U. Starvation-induced hyperactivity in the rat: the role of endocrine and neurotransmitter changes. Neurosci Biobehav Rev. 1993 Fall;17(3):287-94. PMID: 7903806

Semistarved rats develop high running wheel activity. This running activity induces increased norepinephrine, dopamine, and serotonin turnover in the hypothalamus. Corticosterone in plasma becomes increased while luteinizing hormone and testosterone are suppressed. In female rats cyclic gonadal function is suppressed. Running activity in the semistarved rats can be suppressed specifically by serotonin 1-c agonists and by alpha 2-adrenoceptor agonists. This animal model is helpful in understanding the combined effects of starvation and hyperactivity, which are observed in many patients with anorexia nervosa. Observation of the serotonergic system might help to develop a pharmacological treatment of hyperactivity in anorectic patients.


Hi Dean

 

As usual, I agree with most (but not all) of what you said -- especially what you said at the beginning of your post, in response to Tom's two posts. I don't believe that calories expended by exercise "don't count"; and I'm very dubious about your "cold" hypotheses (but DEFINITELY do not want to get into a discussion of either point -- tempus fugit. :) ).

 

I strongly feel that we should all avoid quoting the NIA monkey study -- as Dr. J. said, it was very flawed -- new monkeys introduced at various times, different species from entirely different geographical areas, etc. (The Wisconsin study was of course flawed in that the control monkeys weren't moderately calorie restricted, so the study was really an anti-obesity study rather than a CR study -- and in that the diet was too heavy in sugar. BUT the Wisconsin study DID compare one species of monkey in both groups, without additional monkeys being added at random points to the groups.) The simple fact is, the monkey studies give us little useful information. (The NIA study even failed to note the beneficial effects of CR on health parameters that Luigi's studies have found in humans -- and we're more closely related to humans [I hope] than to monkeys :rolleyes:.)

 

Oh well, I'm sure that the endless hypothesizing will go on indefinitely (until maybe some more information is available).

 

On the subject of the HH: I believe that the benefits of fiber are real. But, please note: supposedly "inert" fiber that we eat is NOT contributing zero calories -- gut microbes -- which vary enormously between people -- partially digest some fibers (depending on the fiber), usually into short chain fatty acids which we can digest -- and that equals some calories. The only way to test the true number of calories consumed that I can think of is "the old Dean way" -- calories in minus calories out -- measured you-know-how. :) [Not many people are going to be anxious to do that -- but you COULD do it in a GOOD primate study -- one of the speakers at CR IX mentioned a short-lived small Chinese primate similar to a lemur. OR in rodents -- or even in flies.]

 

  --  Saul 


Hi Saul,

 

I agree once again that the NIH Primate CR studies had flaws.

 

I figured you'd be skeptical of the hunger hypothesis, given all the fiber you eat in your low fat, high vegetable diet.

 

[T]he only way to test the true number of calories consumed that I can think of is "the old Dean way" -- calories in minus calories out -- measured you-know-how.    :)

 

For those of you too young to remember what Saul is referring to, the "old Dean way" of measuring the real number of calories one is assimilating (rather than simply consuming) involves bomb calorimetry of both food and fecal samples - hence "calories in" - "calories literally out". That was a fun experiment. And people think I'm crazy nowadays... ☺
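The arithmetic behind the "old Dean way" is trivial, even if the measurement isn't. A minimal sketch (the function name and the example numbers are made up for illustration, not measured values):

```python
def assimilated_kcal(food_kcal_gross: float, fecal_kcal: float) -> float:
    """Net energy actually assimilated: gross energy of food eaten
    (measured by bomb calorimetry) minus gross energy recovered in fecal
    samples -- i.e. "calories in" minus "calories literally out"."""
    return food_kcal_gross - fecal_kcal

# Hypothetical example: 2000 kcal gross intake, 250 kcal recovered in feces
print(assimilated_kcal(2000.0, 250.0))  # 1750.0
```

The point being that the number you actually assimilate can be meaningfully lower than what the nutrition labels say you consumed, especially on a very high fiber diet.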

 

--Dean


Is Hunger Necessary? - My Response to TomB

 

This is part 2 in my response to TomB about his argument in favor of the Hunger Hypothesis (HH) - that the experience of hunger is necessary in order to trigger the "CR response", so by implication, a high-fiber, hunger-avoiding diet is a bad idea.

 

Recall in my last post, I addressed Tom's argument that some measure of absolute calorie shortfall (i.e. calorie intake - 'some magic constant') is required for the CR response to be activated, rather than a net calorie shortfall (calorie intake - calorie expenditure). That point is ultimately irrelevant for the hunger hypothesis, since it simply requires a calorie shortfall as a starting point, whether it be absolute or net.

 

In the interest of keeping these posts about Tom's HH argument (relatively) short, I'll focus on his first post regarding the HH, and leave his second post (which is interesting as well but only loosely related to the HH) for a subsequent response. And in lieu of diving into the details of Tom's post with lots of quotes, let me make a high-level comment on, and analysis of, your argument. If I mischaracterize it I'm sure you will let me know, but I've read over your posts several times, so I think it accurately captures what you're saying...

 

It appears you believe that hunger is the primary or perhaps only pathway by which an organism (and importantly, its cells) detect that it is experiencing a calorie shortfall. You also seem to think that hunger is the only way an organism experiencing a calorie shortfall can be motivated to seek food and to trigger the CR response. In short, you appear to have this high-level model of hunger, CR, survival and reproductive fitness in mind:

 

[Diagram: a simple linear model in which hunger drives food seeking and the CR response, which in turn drive survival and reproductive fitness]

While theoretically this is not an entirely unreasonable model, you yourself hint at its shortcomings when you say:

 

 Therefore, adaptations that deal with [Periodic Food Scarcity] must be present in all species from early on till today. 

 

You are perfectly correct in this statement - it does appear that the CR response is preserved across a wide range of species, at least short-lived ones (it remains to be seen whether it persists in long-lived species like humans, as we both acknowledge). The trouble this fact poses for your argument is that hunger as we know it (i.e. that unpleasant feeling we associate with an empty stomach, and which is suppressed by eating a high-fiber, low-GI diet) is an evolutionarily very recent invention. Do you think yeast cells, C. elegans worms, or even fruit flies experience this sort of subjective hunger? 

 

Even food-seeking behavior doesn't require hunger as we know it - simple single-celled organisms will move along a glucose gradient to seek and find food.

 

So both food seeking and the CR response can and do occur in the absence of hunger, as we saw in the example from my last post, where a net calorie deficit local to a cell causes an increase in the ratio of NAD+ to NADH, which in turn increases the cell's expression of SIRT1, which activates many aspects of the CR response.

 

A much more realistic and plausible model of the pathway from a persistent calorie shortfall to increased survival and improved evolutionary fitness is depicted in this diagram:

[Diagram: a calorie shortfall alters cellular energy-state indicators (e.g. the NAD+/NADH ratio), which feed into energy-state signalling molecules (AMPK, SIRT1, NPY, ghrelin, leptin); solid lines run from the ancient sensors to the CR response, while dotted lines mark the uncertain links from the hunger-associated hormones]

 

I apologize for its complexity but even this diagram is far from comprehensive. For example, I didn't even include glucose or insulin levels as indicators of calorie surplus or shortfall.

 

What it shows is that a calorie shortfall affects a number of different biochemical processes (e.g. the NAD+/NADH ratio) reflecting the energy state of the organism. These in turn influence energy-state signalling molecules like NPY, AMPK, SIRT1, leptin etc. Some of these (like SIRT1 and AMPK) are evolutionarily ancient and well known for their ability to kick in important components of the CR response. Others (like NPY, ghrelin and leptin) are more recent, and influence the subjective sense of hunger (at least in higher organisms) and may directly increase food-seeking behavior simply by being present (sans subjective hunger). 

 

The $64K question is whether or not those energy-signalling molecules that are relatively recent add-ons associated with hunger (e.g. leptin, ghrelin, NPY) also directly influence the CR response (i.e. whether the dotted lines in the above diagram exist), and if so, how important they are relative to the evolutionarily primitive and primary cellular-level signalling pathways from a calorie shortfall to the CR response (e.g. AMPK and SIRT1). 

 

Tom, do you see now why your model of hunger's role in the CR response is too simplistic? Do you see why there is a big open question as to whether the so-called hunger hormones like Leptin, Ghrelin and NPY, to say nothing of the subjective feeling of hunger itself (which may or may not track these hunger hormones) are involved in triggering the life-extending CR metabolic response? 

 

Even assuming there is a link between some or all of the hunger-signalling molecules and the CR response, there remains a whole other set of open questions as to whether or not various strategies for increasing subjective satiety (most relevantly for this thread, a high-fiber diet) influence the various hunger-signalling molecules in the first place, and if so, which ones - and whether those are the ones that influence the CR response.

 

Tom, do you see that the HH can't simply be proven by an evolutionary 'just so' story, but instead requires scientific evidence to answer very tangible questions relating to the influence (or lack thereof) of hunger-signalling molecules on the CR response?

 

In short, there is some possibility that hunger may be involved in the CR response, at least in complex organisms where the concept of hunger even makes any sense. But the primary drivers of the CR response appear to be evolutionarily ancient energy-signalling pathways like SIRT1 and AMPK, which have nothing to do with hunger.

 

So a convincing argument for the Hunger Hypothesis - i.e. that hunger is required for the CR response to kick in - will require a lot more evidence than I've seen to date from you or anyone else.

 

--Dean


Tom,

 

I will just briefly address your second post about evidence for the CR response from animal intervention and human observational studies, which tangentially touches on the hunger hypothesis.

 

First, you make quite a big deal about the CR response being very fragile and easily eliminated by any number of conditions and circumstances - to the point where you appear to believe it has only rarely been observed, even in laboratory experiments. You said:

 

Bottom line, I don’t know if we actually have many clear CR-response studies out there, and bringing up failures of the response in the lab is likely not a valid test of the CR-response. In order to study and test CR-response animals, you actually have to have CR-response animals without the CR-response being eliminated by lab conditions and confounders.

 

But the question then becomes - if the CR response is so fragile and hard to reproduce even in controlled laboratory conditions in species much more likely to exhibit it than humans (because of the relative advantage a short-lived species would have if it evolved a CR response), what makes you think:

 

a) the CR response isn't a laboratory artifact, and if not,

b) we can figure how to effectively trigger it in humans, or

c) we can determine whether or not we have triggered it in humans?

 

If very few animals in the lab or in the wild can be observed exhibiting a true CR response, and as you say no human population (not even the Okinawans) has even come close to exhibiting the CR response - how the heck can you think it likely (or even plausible) that it will happen in humans, to say nothing of being able to figure out how to trigger and sustain it?

 

To your credit, you do seem to offer two criteria - hunger and a shift of resources away from reproduction. 

 

Hunger you seem to consider an unequivocal and necessary signal that the CR response is occurring, based on the argument you made in your first post. We've seen how sound that argument was... I don't think I need to say anything further about just how poor hunger is as an indicator of the CR response.

 

As for the second criterion (a resource shift away from reproduction), it seems more plausible. The evidence suggests that the CR primates did not experience a shift of resources away from reproduction - at least in terms of commonly accepted measures. From [1]:

 

Questions about the effects of CR on reproductive function frequently arise. In the early years of the Wisconsin study we examined the frequency of menstrual cycles in control and CR female rhesus monkeys and did not find evidence of disrupted cycles in the latter (Kemnitz et al. 1998). The female monkeys on CR in the NIA study also continued to have normal menstrual cycles (Lane et al. 2001). Both CR and control monkeys exhibited lower estradiol and increasing FSH with advancing age, as is typical of the primate aging process. The females in the Wisconsin and NIA studies were not allowed to breed while in the study, so it was not possible to assess the possible effects of long-term CR on fertility and pregnancy. Short-term CR in baboons, however, did not interfere with conception or gestation (up to the surgical delivery of the fetuses; Li et al. 2007).
 
Recent studies of testicular function in young males in the NIA study did not reveal a significant effect of CR on 24-hour plasma testosterone levels or on indicators of semen quality, such as sperm viability and function (Sitzmann et al. 2010a,b).
 
But I will note from personal experience that a shift of resources away from reproduction (as measured by testosterone & libido, for example) does not require an absolute calorie shortfall as you seem to be arguing - a net calorie shortfall is quite sufficient to trigger it, in my experience. Further, the reproduction-suppressing effects of a net calorie shortfall persist despite a lack of hunger as a result of eating a high-fiber, satiating diet.
 
So if suppression of reproduction-oriented processes and behavior is the "gold standard" you propose for detecting the CR response (since we've ruled out hunger as a marker), it seems to undermine your arguments regarding the two main issues we've been discussing, namely absolute vs. net calorie shortfall and the hunger hypothesis.
 
The next thing I got a chuckle out of was this passage:

[T]he CR-response is an on-off switch. As the old joke goes: you either are pregnant, or you’re not, there is no “a bit pregnant”. You either reach the level of food deprivation mimicking the evolutionary signature of the CR-response, or you don’t. As argued above, switching away from reproduction to survival is a signature of the CR-response. Restricting calories by itself does not trigger the CR-response as we’ve identified it above - you need to reach a specific level (temporary loss of reproductive capability) to trigger the effect.

 

There are so many things wrong with this paragraph I don't even know where to begin. 
 
Your first statement, "the CR-response is an on-off switch", flies in the face of the only evidence for the CR effect that we do have - namely, that it appears as a graded effect in rodents: the greater the calorie shortfall, the greater the lifespan benefit. Remember the graph from Weindruch et al [1] on the CRS website front page?
 
[Graph: survival curves from Weindruch et al. showing progressively longer mouse lifespan with increasing degrees of dietary restriction]
 
While I've argued repeatedly (most recently here) that the benefits of extreme CR are exaggerated and that most of them can be achieved via relatively mild (net) CR in the neighborhood of 20%, nowhere have I argued, nor do I believe, nor does the evidence suggest, that CR is like an on-off switch. More CR leads to (marginally, in my reading of the evidence) more lifespan. Nor does the level of (net) CR that seems from the evidence to be the "sweet spot" in terms of hardship vs. benefits (i.e. ~10-20% CR) come close to triggering cessation of reproductive functions in animals or people.
 
Which brings me to the part that really made me laugh - your analogy with pregnancy. I trust at some point in your life you learned about the birds and the bees, so you realize that to make a baby a sperm has to come together with an egg. Notice anything about that, Tom? Notice that it is an event? It is a very specific, localized, punctuated event that determines whether a woman/female gets pregnant or not, and triggers an incredibly huge cascade of other hormonal, metabolic and physiological changes in the woman/female to prepare for and support the growing infant(s). This is nothing like either the onset of a calorie shortfall (which is graduated both temporally and in degree) or the body's response to a calorie shortfall, which is likewise graduated both temporally and in degree.
 
In short, your statement "the CR-response is an on-off switch" appears to have been pulled out of thin air, supported solely by an incredibly silly and inappropriate analogy between CR and pregnancy. Am I missing something?
 
Finally, in one more crazy statement you say:

...you need to reach a specific level (temporary loss of reproductive capability) to trigger the [CR] effect.

 

This statement could perhaps be justified in women and females from other species that stop experiencing estrus/menstruation as a result of severe CR. But males? Really? Do you really think that men on serious CR exhibit "temporary loss of reproductive capability"?
 
If any human I know is on serious long-term CR, it's Michael. And according to his long-time (former?) lover April Smith, Michael had definitely not lost reproductive capacity circa 2007 at least, as the following excerpt from page 5 of this article attests (my emphasis):
 
“Before CR, I was, if anything, hornier than most men,” says Michael. “But some people find that when they go on very severe CR, their classic male libido—that sort of aaaargh-there’s-a-pretty-woman-I-can’t-stop-my-neck-from-moving libido—goes down.” And Michael, it turns out, is one of those people.
 
“I’ve often thought that if you could explain to women that on CR, men will improve their sexual performance but decrease their skirt-chasing behavior so they only have eyes for you, who they’re in love with, women would be like, ‘I’m cutting your calories, honey. Half your dinner tomorrow,’” April resumes. “A 35-year-old who is mature and is interested in a deep spiritual experience but can fuck like an 18-year-old? That’s a pretty good thing.”
 
Of course these statements may be all hyperbole and bravado. Only Michael and April know the truth...
 
But if we take Michael's and April's statements at face value (coupled with my own personal experiences, BTW), it seems that while the desire to engage in reproductive pursuits diminishes with CR (to the point of being completely suppressed in some instances...), the ability to engage in reproductive functions remains intact in men, and, at least in Michael's case, apparently continues to be actively exercised1.
 
Here is the summary of my three response posts to Tom's contributions to this thread. Tom's arguments for:
  • the CR response being an all-or-nothing affair,
  • complete suppression of reproductive capacity being a good (or the only valid) indicator of CR,
  • the hunger hypothesis, and
  • defining CR in terms of absolute rather than net calorie shortfall

have more holes in them than you can shake a stick at. Pun intended.

 
--Dean
 
1Perhaps this lurid discussion of Michael's sex life will finally provoke him to join our conversation to defend (or deny) his manhood. One can only hope... ☺.
 
--------
[1] Weindruch R, et al. (1986). "The retardation of aging in mice by dietary restriction: longevity, cancer, immunity and lifetime energy intake." Journal of Nutrition, April, 116(4), pages 641-54.
 


First: doltish me, I only realized a couple of weeks ago who "TomBAvoider" was, having previously failed to make the connection even after reading his many spot-on comments consistent with his prior pattern. Welcome back, Tom: thanks very much for engaging again.
 
Second: there is way, way too much stuff in this thread that I wish I (or someone) could properly address, much of it going back to positions Dean has argued with extensive and clear argumentation and evidence in a variety of previous threads, but that are nonetheless fatally flawed in fundamental ways; I don't have the time or brain power to spare to demonstrate any of this, despite having at least started to do so in draft responses to those earlier threads. Since Dean is not only nagging me incessantly to comment, but is even "fighting dirty" to try to draw me in, I will simply proclaim ex cathedra that:
 
-Dean is totally mistaken in his interpretation of the Adventists-vs.-Okies issue, even after you take into account the basic fact that he's comparing completely different kinds of studies across different cultures;
 
-Dean is oversimplifying a bit about net energy deficit being equivalent to CR, and argues in a crazy circle to maintain the plausibility of a simple equivalence;
 
-Tom is totally wrong about the fragility of the CR response: it is extremely robust, some of the alleged failures (like the wild-derived mice) were actually quite successful, and others were single experimental failures with obvious explanations, trumped up as evidence of such fragility despite the same design having succeeded in multiple cases when done right;
 
On a less clear-cut question: I'm not sure, but it may be that Tom and Dean are simply talking at cross-purposes re: the on/off switch of CR. The CR effect is clearly not an on/off switch in the simple binary sense that Dean reasonably understands by the term, precisely because of the graduated response that he outlines. However, the CR response does seem to be something similar to an on/off switch in the sense of a trigger or threshold effect: one does seem to have to reach a certain level of CR to lead to slower aging. The effects of energy balance — and in particular, of the "Calories in" portion of that balance — on the diseases of aging and on "aging itself" are not simply a continuum with obesity on one end and weep-on-your-knees CR on the other: rather, obesity-avoidance (via reduced/controlled energy intake or increased exercise) avoids segmental and supernumerary aging damage, but only a certain minimum level of CR proper retards the aging process.
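One way to picture this trigger-plus-gradation reading is as a toy piecewise function. To be clear, the threshold value and the linear scaling below are hypothetical stand-ins for the sake of illustrating the curve's shape, not anything established by the literature:

```python
def aging_retardation(cr_fraction: float, threshold: float = 0.10) -> float:
    """Toy model of a threshold-then-graded CR response.

    cr_fraction: fractional calorie restriction below baseline intake
    (e.g. 0.20 for 20% CR). Below the (hypothetical) `threshold`,
    obesity-avoidance benefits may still accrue, but aging proper is
    not slowed; above it, the effect grows with restriction beyond
    the trigger.
    """
    if cr_fraction < threshold:
        return 0.0  # trigger not pulled: no retardation of aging itself
    return cr_fraction - threshold  # graded effect once past the trigger
```

On this picture, mild restriction below the trigger returns zero (no slowing of "aging itself"), while restriction past the trigger yields a graded, increasing effect - which is how an "on-off switch" and a "graded response" can both be true at once.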
 
Complicating matters, however — and returning to Dean's simplification of the "net Calories" question — it does seem from PMID 9049716 (1) (discussed in a thread from the CR Society Listserv from 2001, involving myself, Dean and Saul, and alluded to here by Dean) that once the trigger has been pulled, additional energy deficits from exercise can further enhance the effect, creating a post-trigger "net Calories" effect or something close to it, even though (in the same paper and elsewhere) an exercise-induced energy deficit ("energy-out") can't itself activate the trigger in the absence of restriction of Calories ("energy-in").
 
-Certainly there is good evidence that the CR primates did not experience a significant shift of resources away from reproduction, and also did not experience significant hunger or consistent CR: that's a large part of why it still seems reasonable to think that CR might work in rhesus monkeys, barring which it would be surprising if it turned out to work in humans.
 
On a specific point related to that:
 

 

you need to reach a specific level (temporary loss of reproductive capability) to trigger the [CR] effect.

 
This statement could perhaps be justified in women and females from other species that stop experiencing estrus/menstruation as a result of severe CR. But males? Really? Do you really think that men on serious CR exhibit "temporary loss of reproductive capability"?
 
If any human I know is on serious long-term CR, it's Michael. And according to his long-time (former?) lover April Smith, Michael had definitely not lost reproductive capacity circa 2007 at least, as the following excerpt from page 5 of this article attests [...] it seems that while the desire to engage in reproductive pursuits diminishes with CR (to the point of being completely suppressed in some instances...), the ability to engage in reproductive functions remains intact in men

 


I'd say that there is some ambiguity in the very broad language of "reproductive ability" here, and that you (Dean) are overextending an overly-broad and nonspecific phrase as it would apply to the idea of a resource shift from "reproduction" to maintenance. Desire for sex, reproductive ability, and erectile function are not simply the same thing, although it's of course difficult to reproduce with zero erection and no technological facilitation. CR males — rodent or human — do exhibit a shift in resources away from aspects of reproductive ability — notably, sperm count and what we might call "reproduction-seeking activity," even when a willing female is available (I have paper copies of such studies, discussed in the Archives, but not to hand) — and, in rodents (and, we hope and have some reason to believe, in humans), a corresponding increase in investment in maintenance (tho' I believe that the main anti-aging effects of CR are due to reduced generation of damage and to energy-conserving processes as much as or more than to an active increase in maintenance investments).
 
However, this is quite separate from erectile function, which is largely dependent on endothelial function, which latter is very well-documented to be maintained or enhanced by CR during aging* (so that you might get better erectile function "for free" with the more generalized effect — and it certainly doesn't come at any great energetic cost). This is in evidence in multiple reports of superior erectile function in CR rodents (2-5), in a dose-dependent fashion,(4) and in anecdotal accounts of equal or superior function in human CR males. And somewhat surprisingly, (7) found that "Neither long-term alcohol ingestion nor caloric restriction were associated with major decrements in copulatory behavior", tho' unsurprisingly "Long-term alcohol ingestion was associated with decrements in erectile function ex copula [ie, directly, as in response to drug stimulus]". I presume this is again a matter of better response maintenance during aging; either way, it's not going to lead to superior reproduction, because CR animals' sperm counts will be lower in youth, and fertility is likely very low in any case in old age even if the rate of decline is slower.
 
Additionally, humans develop atherosclerosis, which WT rodents do not; in our cohort, and also in historical famine populations, CR retards and even partially reverses aspects of atherosclerosis, which is presumably also contributing to better maintenance of erectile function with age, even in the presence of low reproductive activity due to withdrawal of investment into making new sperm cells.
 
So, I would say that it's fair to say that CR leads to a shift of resources away from reproduction sensu stricto in rodents and humans, despite better maintaining erectile function with age.

 * (As opposed to the acute effects of starvation/fasting/food withdrawal, which unsurprisingly lead to an acute loss of response to chemical and other stimuli as the animal seeks out food; e.g. (7).)
 
References
1. Holloszy JO.
Mortality rate and longevity of food-restricted exercising male rats: a reevaluation.
J Appl Physiol. 1997 Feb;82(2):399-403.
PMID: 9049716 [PubMed - indexed for MEDLINE]
 
2: Ozbek E, Simsek A, Ozbek M, Somay A. Caloric restriction increases internal iliac artery and penil nitric oxide synthase expression in rat: comparison of aged and adult rats. Arch Ital Urol Androl. 2013 Sep 26;85(3):113-7. doi: 10.4081/aiua.2013.3.113. PubMed PMID: 24085231.

3: Tomada I, Fernandes D, Guimarães JT, Almeida H, Neves D. Energy restriction ameliorates metabolic syndrome-induced cavernous tissue structural modifications in aged rats. Age (Dordr). 2013 Oct;35(5):1721-39. doi: 10.1007/s11357-012-9473-z. Epub 2012 Sep 26. PubMed PMID: 23010986; PubMed Central PMCID: PMC3776100.

4: Maio MT, Hannan JL, Komolova M, Adams MA. Caloric restriction prevents visceral adipose tissue accumulation and maintains erectile function in aging rats. J Sex Med. 2012 Sep;9(9):2273-83. doi: 10.1111/j.1743-6109.2012.02681.x. Epub 2012 Mar 16. PubMed PMID: 22429455.
 
5: Hannan JL, Heaton JP, Adams MA. Recovery of erectile function in aging hypertensive and normotensive rats using exercise and caloric restriction. J Sex Med. 2007 Jul;4(4 Pt 1):886-97. PubMed PMID: 17627736.

6: Alvarenga TA, Andersen ML, Papale LA, Tufik S. Effects of long-term food restriction on genital reflexes in paradoxically sleep-deprived male rats. Brain Res. 2006 Oct 18;1115(1):148-54. Epub 2006 Aug 30. PubMed PMID: 16938279.

7: Clark JT, Keaton AK, Sahu A, Kalra SP, Mahajan SC, Gudger JN. Neuropeptide Y (NPY) levels in alcoholic and food restricted male rats: implications for site selective function. Regul Pept. 1998 Sep 25;75-76:335-45. PubMed PMID: 9802427.


Michael,

 

Thanks for your response! Sorry I had to sink so low to get your attention ☺.

 

If there is one thing I'd love to see you speak to (or point us to), is the evidence for my "totally mistaken ... interpretation of the Adventists-vs.-Okies issue". I realize they are very different cultures, with very different access to medical treatment etc, and that cross-cultural epidemiological studies are notoriously difficult, to say nothing of trying to compare different studies covering different cultures. But if there is more to it than that (which you seem to imply), I know we'd all love to hear at least a hint as to what it is. Please don't let it become a statement like Fermat's Last Theorem. ☺

 

On your other observations:

 

Absolute vs. Net Calorie shortfall: I can agree that there is some degree of calorie shortfall required for benefits, but I think the evidence is pretty suggestive that the extra benefits of increasing degrees of CR drop off more precipitously than the Weindruch graph would suggest, and that "absolute" CR (i.e. calories, calories, calories) is a much squirrelier and more incoherent concept than most believe - and therefore that all CR effects are actually net CR effects, to one degree or another, since any concept of baseline calorie intake will vary dramatically based on circumstances. If the CR response is as robust as you say it is, it seems to me it has got to be operative across a wide range of environmental conditions, and therefore across a wide range of baseline calorie intakes.  But I'd certainly be extremely interested in an argument to the contrary, if you can ever find time to make one.

 

Shift From Reproduction: I think we're all on the same page that CR definitely downregulates certain aspects of sexuality and reproductive functions. My only contention was that Tom's statement "...you need to reach a specific level (temporary loss of reproductive capability) to trigger the [CR] effect" was much too black and white, and may conflate cause and effect. A certain degree of resource reallocation away from reproduction is one of the manifestations of the CR effect, but not the only one, and certainly not all-or-nothing, at least in men. Rereading it, it seems more like either circular logic, a tautology or reverse causality than a meaningful, coherent statement. What Tom seems to be really saying with that statement (in the context of what he says elsewhere in that post) is:

 

"You need to reach a specific level of [calorie shortfall which results in] temporary loss of reproductive capability to trigger the [CR] effect [which manifests as various metabolic changes, but most importantly in Tom's mind as a temporary loss of reproductive capability]."

 

See what I mean by circularity / tautology?

 

In my book, loss of sexual or reproductive drive and/or capacity is just one (unfortunate, most would say) side effect of CR. Granted, it may be a pretty consistent marker of the CR state, but it's not clear to me how causal it is wrt CR benefits. CR changes lots of things about metabolism, many of which I consider more important for longevity than reduced reproductive capability / sex hormone levels (e.g. lower systemic inflammation, insulin and IGF-1). By analogy: red skin is a good marker for a sunburn, but it is a side effect and not causally responsible for a sunburn's harm - the real culprit is UV radiation damage to skin-cell DNA.

 

Thanks again for contributing Michael - your thoughts are always appreciated!

 

--Dean


Meta-Analysis Shows Increased Dietary Fiber Reduces All-Cause Mortality

 

Back to the topic of this thread - fiber and the hunger hypothesis.

 

Al Pater posted this 2015 meta-analysis [1] which combined data from 17 prospective cohort studies involving nearly a million people to see how fiber intake and mortality are correlated.

 

Averaged across the 17 studies, they found a 10% reduction in all-cause mortality for each 10 g/day increase in dietary fiber. Here is the forest plot of the relative risk of mortality for the top third vs. bottom third of fiber intake across the various studies included in the meta-analysis:

 

[Forest plot: study-by-study relative risk of all-cause mortality, highest vs. lowest tertile of fiber intake]

 

As you can see, the benefits were quite consistent, even though the studies controlled for a range of confounders. Interestingly, in the studies that controlled for serum cholesterol, the benefit of fiber was much smaller - suggesting that fiber exerts at least part of its benefit by reducing serum cholesterol, which shouldn't surprise anyone.
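For back-of-the-envelope purposes, the per-10 g figure can be extrapolated to other increments by assuming a log-linear dose-response - an assumption for illustration only, not something the paper itself asserts:

```python
def relative_risk(extra_fiber_g: float, rr_per_10g: float = 0.90) -> float:
    """Extrapolate the meta-analysis estimate (RR 0.90 per 10 g/day of
    added fiber) to an arbitrary increment, assuming the effect compounds
    log-linearly with dose (an illustrative assumption)."""
    return rr_per_10g ** (extra_fiber_g / 10.0)

# Under that assumption, 20 g/day more fiber:
print(round(relative_risk(20.0), 2))  # 0.81
```

Of course, no one should take the log-linear shape seriously far outside the intake range actually observed in these cohorts - which is exactly the caveat for high-fiber eaters discussed next.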

 

This meta-analysis didn't report the actual fiber amounts people were consuming in the studies it combined, but I think we can safely assume they were mostly in the 10-40 g/day range - a far cry from the amount of fiber many of us eat. So the relevance of this study for those of us already eating a super-healthy diet with lots of fruits, vegetables, whole grains and legumes is not clear.

 

Moreover, this sort of meta-analysis of fiber and mortality risk in the general population certainly doesn't answer the question of whether there is such a thing as too high a fiber intake, to say nothing of whether such a hypothetical upper healthy limit has anything to do with fiber's tendency to reduce hunger, i.e. the Hunger Hypothesis.

 

But it seems pretty obvious from this and other studies cited above that "more is better" when it comes to fiber intake for the general population eating a low-to-moderate fiber diet right now.

 

--Dean

 

--------

[1] Am J Epidemiol. 2015 Jan 15;181(2):83-91. doi: 10.1093/aje/kwu257. Epub 2014 Dec 31.
 
Association between dietary fiber and lower risk of all-cause mortality: a
meta-analysis of cohort studies.
 
Yang Y, Zhao LG, Wu QJ, Ma X, Xiang YB.
 
Although in vitro and in vivo experiments have suggested that dietary fiber might
have beneficial effects on health, results on the association between fiber
intake and all-cause mortality in epidemiologic studies have been inconsistent.
Therefore, we conducted a meta-analysis of prospective cohort studies to
quantitatively assess this association. Pertinent studies were identified by
searching articles in PubMed and Web of Knowledge through May 2014 and reviewing 
the reference lists of the retrieved articles. Study-specific risk estimates were
combined using random-effects models. Seventeen prospective studies (1997-2014)
that had a total of 67,260 deaths and 982,411 cohort members were included. When 
comparing persons with dietary fiber intakes in the top tertile with persons
whose intakes were in the bottom tertile, we found a statistically significant
inverse association between fiber intake and all-cause mortality, with an overall
relative risk of 0.84 (95% confidence interval: 0.80, 0.87; I(2) = 41.2%). There 
was a 10% reduction in risk for per each 10-g/day increase in fiber intake
(relative risk = 0.90; 95% confidence interval: 0.86, 0.94; I(2) = 77.2%). The
combined estimate was robust across subgroup and sensitivity analyses. No
publication bias was detected. A higher dietary fiber intake was associated with 
a reduced risk of death. These findings suggest that fiber intake may offer a
potential public health benefit in reducing all-cause mortality.
 
© The Author 2014. Published by Oxford University Press on behalf of the Johns
Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, 
please e-mail: journals.permissions@oup.com.
 
PMID: 25552267

A few brief things:

Tom was, at least in the context of the CR Society, the originator of the HH: way back on March 07, 2002, a poster using a self-professed anagrammatic alias of the person we know as TomB posted under the subject line "Hunger is NECESSARY for CRON benefits!" about the "SENSATION of hunger [being] integral to achieving the benefits of CR."

Saul is right about Miller's comment, but it was a one-off thing: it was Speakman who has explored the question in research.
 

if we know anything, we know that a high fiber, high volume, low-GI diet is a great way to reduce the hunger that accompanies CR, and that some say, may be required for CR to be beneficial - the so-called "Hunger Hypothesis".


Fiber certainly does help to manage hunger, but it (and our other tricks) is far from a truly effective appetite suppressant. And while I won't say dogmatically that someone not experiencing some level of hunger couldn't possibly be on CR, I can tell you that I and many other CR people experience both occasional acute bouts of hunger and a harder-to-define continuous background awareness of chronic energy deficit and vigilance that seems subjectively to me to be unique to the CR state, despite consuming quite enormous amounts of fiber and low-carb vegetables. Certainly, trials feeding people higher-fiber diets lead to only very modest and short-term weight loss, nothing equivalent to 20% (let alone 40%) CR from an already-lean state, as in the studies of CR proper.
 
I also don't think that the fiber epidemiology is particularly relevant here, particularly since its main putative mediator in this context (more fiber → less hunger → less food intake) is accounted for by adjustment for BMI in any well-done prospective study.
 

Michael discusses the HH in his comprehensive SENS blog post on the Primate CR studies - suggesting it might be an explanation for the disappointing monkey results based on the fact that over the years the monkeys in the NIA's CR group appeared to become less motivated by food [3], suggesting they weren't experiencing much hunger.
 
He suggests neuropeptide Y (NPY) or ghrelin as two potential candidate signalling molecules associated with hunger that might mediate the HH effect on longevity. He focuses a lot on NPY, since it seems to be elevated both by acute fasting and at least by several months of chronic CR - which makes it unusual among hormones and neuropeptides involved in energy homeostasis, which generally tend to return to baseline after a few week or months of chronic energy restriction.
 
But the evidence he provides in that blog post to support the involvement of NPY (or ghrelin for that matter) in the longevity benefits of CR seems to me to be pretty scant. ...

The evidence he provides to suggest a direct link between hunger (and esp. elevated NPY) and longevity seems similarly weak and tenuous. He cites [5], which found that reducing NPY via lesion or genetic mutation prevents CR from protecting mice against skin cancer. He also cites [6], a study of a drug that, among several effects relating to serotonin, may possibly (Michael's word) block the effect of NPY. Rats given the drug ate 10% less food when fed ad lib than rats not given the drug, but didn't live any longer (except for the male rats on a medium dose, who did live longer). As I said, pretty tenuous evidence for a link between NPY and longevity if you ask me.


First, I think that's actually rather substantial evidence, particularly since the skin carcinogenesis model is so bloody counterintuitive. Imagine! You knock out a gene required for NPY — a central mediator of energy sensing — or induce lesions in the relevant center of the brain, and it obviates the effect of CR in protecting the animal against chemical carcinogens! That's a pretty bloody striking result. So is the finding that a drug that causes an animal to eat less food and lose weight can lead to no effect across the entire stretch of a survival curve, despite the multiple cases of "crypto-CR" in the literature (generally caused by the food just tasting awful when heavily dosed with some drug).
 
I would agree, however, that it's far from definitive: but two things here. First, you're ignoring one important wider point about the apparent lack of hunger on the part of the NIA "CR" primates (more on this below): it implies that they probably weren't on CR at all. A CR animal is a hungry animal, whether or not hunger as such, or NPY signaling as its mediator, is mechanistically involved in the life-extending effects of CR.
 
Second: since I wrote that monstrous blog post, a significant amount of further evidence in support of the NPY-based "Hunger Hypothesis" has been published:
 

Sci Rep. 2014 Mar 31;4:4517. doi: 10.1038/srep04517.
A key role for neuropeptide Y in lifespan extension and cancer suppression via dietary restriction.
Takuya Chiba, Yukari Tamashiro, Daeui Park, Tatsuya Kusudo, Ryoko Fujie, Toshimitsu Komatsu, Sang Eun Kim, Seongjoon Park, Hiroko Hayashi, Ryoichi Mori, Hitoshi Yamashita, Hae Young Chung & Isao Shimokawa

... In this study, we found that neuropeptide Y (Npy), which mediates physiological adaptations to energy deficits, is an essential link between DR and longevity in mice. The lifespan-prolonging effect of lifelong 30% DR was attenuated in Npy-null mice ... In male WT mice, DR extended lifespan by 20.3% and 14.8% at the 50th and 25th percentile survival points; in male Npy−/− mice, these were −1.9% and 7.1%, respectively. In female WT mice, DR increased lifespan by 36.0% and 33.6% at the 50th and 25th percentile survival points; in female Npy−/− mice, DR increased lifespan by −1.0% and 19.5%. Female Npy−/− DR mice received 10% more food compared to female WT DR mice [because their corresponding AL knockout mice ate more]; this might cause the diminution of the life-extending effect of DR, when compared to the extent of lifespan extension by DR in WT mice. However, daily allotments for male Npy−/− DR mice were 5% less than those for male WT DR mice [for the same reason]; nonetheless, the life-extending effect was diminished in Npy−/− DR mice.
 
 
[Image: survival curves for AL and DR wild-type and Npy−/− mice, Fig. 3 of Chiba et al.]
 
 
[NPY knockout also attenuated the effect of lifelong 30% DR] on the occurrence of spontaneous tumors and oxidative stress responses in comparison to wild-type mice.
 
In contrast, the physiological processes activated during adaptation to DR, including inhibition of anabolic signaling molecules (insulin and insulin-like growth factor-1), modulation of adipokine and corticosterone levels, and preferential fatty acid oxidation, were unaffected by the absence of Npy.
 
These results suggest a key role for Npy in mediating the effects of DR. We also provide evidence that most of the physiological adaptations to DR could be achieved in mice without Npy.

PMID: 24682105
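For anyone unfamiliar with the "percentile survival point" metric quoted above, here's a minimal sketch of how it's computed, using invented 12-mouse cohorts (all numbers made up for illustration; the interpretation of "25th percentile survival point" as the age by which 75% of the cohort has died is my assumption, not something spelled out in the paper):

```python
import math

def survival_point(lifespans, frac_surviving):
    """Age by which only `frac_surviving` of the cohort remains alive."""
    deaths = sorted(lifespans)
    k = math.ceil(len(deaths) * (1 - frac_surviving))  # deaths by that point
    return deaths[k - 1]

def pct_extension(al, dr, frac):
    """Percent lifespan extension of DR over AL at a given survival fraction."""
    return 100 * (survival_point(dr, frac) - survival_point(al, frac)) / survival_point(al, frac)

# Hypothetical lifespans in weeks for 12 AL and 12 DR mice:
al = [90, 95, 100, 105, 108, 110, 112, 115, 118, 120, 125, 130]
dr = [100, 110, 120, 125, 130, 132, 135, 138, 140, 145, 150, 155]
print(pct_extension(al, dr, 0.50))  # extension at median survival
print(pct_extension(al, dr, 0.25))  # extension at 25th percentile survival
```

With only 12 animals per group, each percentile estimate hinges on a single death time, which is worth keeping in mind when weighing the reported percentages.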


See also the review "Hungry for life: How the arcuate nucleus and neuropeptide Y may play a critical role in mediating the benefits of calorie restriction," roughly contemporaneous with the original CR-NPY-skin cancer study, on some interesting mechanistic mediators; the later review "Neuropeptide Y: An Anti-Aging Player?" (PMID 26549884) (If this pdf is incomplete, drop me an email or PM: I have a complete copy); and PMID 25775546, "Neuropeptide Y stimulates autophagy in hypothalamic neurons."
 

He suggests the lack of a drop in blood pressure in the CR monkeys is suggestive of a low NPY level, since both CR and elevated NPY are usually accompanied by a drop in blood pressure. But there are lots of things that affect BP besides NPY, so his reasoning seems like a pretty big stretch. And even if it were a lack of elevated NPY that explained why the CR monkeys' BP didn't drop, that still doesn't say anything (directly at least) about whether elevated NPY (a surrogate for hunger) has anything to do with the lifespan effects of CR. Although high BP is the world's #1 cause of early preventable death, ahead of tobacco and alcohol use [2], I don't think anyone (esp. Michael) would claim that you can gain CR lifespan benefits simply by reducing your BP, e.g. through sodium restriction or blood pressure medication. So if NPY is going to affect longevity, it probably isn't through its BP-lowering effect.


I think you missed the thrust of my argument here. I certainly wasn't arguing that BP lowering or its lack is indispensable to the effect of CR, as you seem to have taken away. Certainly, no one would think that a lack of BP lowering would block an anti-cancer effect, as was seen in the NPY knockout mice above, or in possible parallel in the NIA "CR" monkeys. You ignored/neglected the context in which the BP comment was made, which was in the sentence immediately following:
 

A failure to elicit an elevation in NPY in these animals would also be consistent with the relatively modest effects of CR at NIA on hormones that are strongly regulated by NPY and hypothalamic energy sensing, such as reproductive hormones(162-166) and T3.(167-168) Similarly, there is evidence that elevated NPY is central to CR-induced reductions in blood pressure,(169-171) and whereas the reductions in BP were profound in the CR Society cohort,(93) they were modest and inconsistent in the NIA CR primates(158). [emphasis secondary]

 
Rather, as should be clear from the above, I was citing the lack of BP lowering as additional and more specific evidence of a lack of NPY elevation in the NIA "CR" monkeys, granted the more impressive evidence of a lack of hunger in these animals:
 

a food-retrieval study(147) found that the CR monkeys in the NIA study were no more motivated than age-matched controls to retrieve food, suggesting that they were not experiencing significant hunger, despite being ostensibly more restricted than the mice in the previously-described report. The apparent lack of hunger in the NIA CR primates is consistent with both the evidence that the rations provided to the AL animals were in excess of their appetites, and especially with the gradual convergence of energy intakes between the two groups over time.(147) [emphasis secondary]


Michael,

 

Thanks for your response on this hunger hypothesis (HH) topic. I can see now why you think neuropeptide Y (NPY) is both an important marker of hunger and also potentially important for CR lifespan benefits. Your study [1] of longevity in NPY-knockout mice is quite interesting, and is one that I've not looked at (at least in any detail) before. Thanks for pointing it out.

 

I certainly understand and appreciate your perspective. A cursory reading of [1] does seem to suggest knocking out NPY puts the kibosh on CR benefits. And since I know you are very busy Michael, I can understand why a cursory reading may have been all you had time for, although it appears that much of your analysis of [1] was done many years ago, as part of your mega-post on the NIA monkey CR study. Perhaps you were very busy then too...

 

Below I'll try to point out aspects of [1] you seem to have overlooked or chosen to ignore, but which nonetheless make it quite irrelevant as evidence in favor of even the idea that NPY is important for CR benefits, to say nothing of evidence supporting the hunger hypothesis. And don't even get me started on how irrelevant this study is regarding the health benefits (or detriments) of a very high fiber diet...

 

First off - here is an easy one. This was a tiny study, with only 12 mice in each group - a fact that by itself reduces this study's credibility, but which will become more important below...

 

But here is the real kicker - the most obvious flaw this study has as evidence to support the hunger hypothesis. It's something so obvious in fact that I'm pretty surprised you didn't pick up on it. Let's look at the food consumption data, shall we?:

 

[Image: ad libitum food intake curves for WT and NPY-/- mice of both sexes]

Notice anything strange about these graphs? Hint: you'd expect mice who aren't hungry to eat less when given free access to food. That's right, when fed ad lib, NPY knockout (NPY-/-) mice of both genders (red curves) ate just as much as the wild-type (WT) mice. In fact, the NPY-/- females ate slightly more food than the wild-type females.

 

If they weren't hungry due to lacking NPY, why did they eat just as much or even more food than ad lib normal mice? Obviously we can't ask them directly about their subjective hunger level; we can only infer it from their food intake (although see below for more about this), which suggests they weren't any less hungry. And it seems reasonable to extrapolate this similarity in ad lib food consumption (and therefore matching hunger) to the CR mice as well. In other words, if the NPY-/- knockout mice ate as much as the NPY-replete mice, and so by inference weren't any less hungry when both groups were fed ad lib, what makes you think the NPY knockouts were any less hungry than wild-type mice when both were fed 30% less than ad lib?

 

So much for this study saying anything about the hunger hypothesis...

 

But believe it or not, it gets worse, much worse, at least as far as the HH is concerned.

 

Consider the following. Despite eating as much, the NPY-/-​ CR males weighed less than the WT CR males (purple vs. green line below), while no such weight difference was apparent in the ad lib fed males (red vs blue line below):

 

[Image: body weight curves for male AL and CR mice, WT vs. NPY-/-]

The authors recognize that eliminating NPY signaling has subtle and poorly understood endocrine effects that could disrupt the mice's metabolism and potentially explain this weight difference under CR:

 

Genetic disruption of Npy signaling, however, exerts subtle effects on feeding and weight gain in young mice [refs], most likely due to compensatory changes in the neuroendocrine system that normalizes feeding and energy expenditure in the absence of Npy.

In other words, eliminating NPY probably results in changes to other unidentified endocrine hormones that alter the eating behavior and metabolism of NPY-/- mice. So we've already got two anomalies undermining the relevance/credibility of this study: NPY-/- mice don't appear to be any less hungry, and the male NPY-/- mice had unexplained extra weight loss on CR despite eating as much food as the WT CR mice, likely due to changes in other unknown hormones.

 

This anomalous weight loss in the NPY-/-​ males subjected to CR wouldn't be so significant, if it weren't for the fact that fully 50% of the CR male NPY-/-​ mice died within the first year of the study, as clearly depicted in the survival curves you posted above, which I've reproduced below for easy reference. Notice the huge drop in the purple line in the left graph between 30 and 48 weeks, shortly after onset of CR, showing half the CR NPY-/-​ males kicking off quite early in the study:

 

[Image: survival curves, showing the steep early drop in the CR NPY-/- male group between 30 and 48 weeks]

In short, half the CR male NPY-/-​ mice were dead before the human equivalent of age 40 - right around the time most of us started CR. The authors obviously noticed this early mortality in CR NPY-/-​ males, but had no explanation for it:

 

Five of twelve male Npy−/− DR mice died before reaching the age of 52 weeks of life; in contrast, only a few WT mice died during this period (Fig. 3A). Post-mortem examination found no specific causes of death in these males.

Honestly Michael, this is starting to look like exactly the kind of study of genetically f*cked-up mice that you're always railing against when CR doesn't work - e.g. your dismissal of the Nelson study [2], which showed CR fails to extend lifespan in many different strains of mice, because, in your words (my emphasis):

 

The generalizability of the high level and opposing directions of response to CR in this study [2] is rendered unlikely by the inclusion of the DBA/2 strain as one of 8 inbred mouse strains contributing to the recombinant crosses used in this study. DBA/2 is an extremely fragile, short-lived, and disease-prone strain, [...] which seems from other research to be uniquely inflexible in metabolically adapting to CR,([refs]), which was seeded across the spectrum of these strains, rendering parlous any extrapolation of the results of this study...

Them's some mighty big words (Parlous - "full of danger; precarious". Now that's a new one for my vocabulary!). I'll translate Michael's techno-speak - "Don't trust studies in which CR fails because they are using mice that are genetically f*cked up and therefore fragile and unable to handle the rigors of CR".

 

I'd say that a 50% mortality rate before age 40 when subjected to CR qualifies these NPY-/-​ mice as f*cked up and fragile - wouldn't you?!

 

As you know well Michael, completely knocking out important hormones (as opposed to simply reducing their levels) can be extremely detrimental, even when we know with near certainty that the hormone in question ultimately hastens the aging process. IGF-1 is the poster child for this "double-edged sword" effect. Completely knocking out IGF-1 results in mice that are as dead as doornails. From PMID 19760669:

 

Unfortunately, IGF-1 knockout mice have severe developmental abnormalities and most do not survive, making it difficult to study how genetic ablation of IGF-1 affects colon tumorigenesis.

But as we well know, reducing IGF-1 increases longevity and is thought to be an important cause of CR benefits. In short, the complete absence of IGF-1 is lethal, but keeping it relatively low is life-extending. So what makes you think it's reasonable to extrapolate longevity results from mice completely lacking NPY to a treatment or behavior (e.g. eating lots of fiber) that may modestly modulate NPY around the neighborhood of its normal level? Seems to me like quite a stretch - stretching credulity past the breaking point...

 

But getting back to the specifics of [1]. The authors were faced with an unfortunate predicament. Half of one of their groups of mice had died of mysterious causes very early in life, in what is supposed to be a lifespan study. To make matters worse, they only had 12 mice in each group to begin with. So they did what any good researchers would do in the situation: they soldiered on by throwing out the anomalous early deaths and reanalyzing the data:

 

To eliminate bias resulting from these early deaths [of male NPY-/- mice], we reanalyzed the lifespan data by censoring the deaths of these mice... In WT mice, the DR group lived significantly longer than the AL group (p < 0.0001 [Diet: RR = 0.240 (0.126 ~ 0.440)]). In NPY-/- mice, DR mice also lived longer than AL mice (p = 0.0167 [Diet: RR = 0.329 (0.119 ~ 0.821)]). However, the RR seemed to be greater in Npy−/− mice than in WT mice.

Notice they say "the [relative risk] seemed to be greater in NPY-/-​ mice than in WT mice"? If you look at the 95% confidence interval, the range of possible CR longevity benefit is clearly wider for the NPY-/-​ mice on account of the fact that there were only 6 or 7 of them left to analyze, so the uncertainty is greater. But at the same time, the confidence interval of lifespan benefit resulting from CR in NPY-/- mice completely spans that of the CR WT mice. In short, once they threw out the early mortality males, the authors didn't observe a statistically significant better result of CR in the WT mice than in the NPY-/- mice.
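The overlap argument can be made concrete by back-calculating standard errors from the reported 95% confidence intervals and testing the difference between the two log risk ratios. To be clear, this is my own back-of-the-envelope reanalysis from the published point estimates (assuming approximately normal log-RRs), not anything the authors computed:

```python
import math

def log_rr_and_se(rr, lo, hi):
    """Back-calculate ln(RR) and its standard error from a 95% CI."""
    return math.log(rr), (math.log(hi) - math.log(lo)) / (2 * 1.96)

# Reported DR-vs-AL risk ratios after censoring the early deaths:
wt_log, wt_se = log_rr_and_se(0.240, 0.126, 0.440)   # wild-type mice
ko_log, ko_se = log_rr_and_se(0.329, 0.119, 0.821)   # NPY-/- mice

# z-test for the difference between the two log risk ratios
z = (ko_log - wt_log) / math.sqrt(wt_se**2 + ko_se**2)
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, two-sided p = {p:.2f}")
```

The p-value comes out around 0.6 - nowhere near significance - which is consistent with the authors' carefully hedged "seemed to be greater" wording.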

 

So unlike what the survival graphs Michael posted would appear to show - no benefit of CR in NPY-/- mice - CR did extend the lifespan of NPY-/- mice once the early male deaths were eliminated, and in fact the NPY knockout mice benefited to a degree that was statistically indistinguishable from the benefit the wild-type mice received from CR. This would seem to seriously undermine the idea that this study shows NPY to be critical for triggering the CR response, to say nothing of its significance for the hunger hypothesis.

 

But incredibly, that's not even the end of the reasons why this study is irrelevant.

 

You'd think that in a study like this one, comparing lifespan in wild-type mice vs. mice lacking a single gene (in this case NPY), the researchers would make sure the other genes were as similar as possible between the two groups, to be sure that any difference in lifespan could be attributed to the difference in that single gene (NPY) - right?

 

Nope, the presence or absence of the NPY gene wasn't the only genetic difference between the WT and NPY-/-​ mice in this study. In fact, the WT and NPY-/-​ mice were from two entirely different strains, which the authors recognize as a shortcoming of their study:

 

A limitation of this study is the fact that the genetic backgrounds of the Npy−/− (Npytm1Rpa/J, approximate to 129S1/SvImJ) and WT (129S6/SvEvTac) mice differed.

The authors use weasel words to argue that these other genetic differences probably didn't matter. But it seems to me that they were comparing the longevity of apples vs. oranges, and that the NPY status of the groups may have had little if anything to do with the observed lifespan differences. Perhaps it was those other genetic differences (besides NPY) that made the NPY-/- mice more fragile when subjected to CR, explaining the early male mortality and the only modest benefit from CR. I'm not saying these other, unaccounted-for genetic differences are necessarily the explanation for the (modest and non-significant) longevity differences observed between the WT and NPY-/- mice. I'm just saying it's a possibility, and that this possibility further undermines the credibility and relevance of this study for the question at hand.

 

Finally, if all that weren't enough to fully undermine the relevance of this study, CR was commenced abruptly at 12 weeks of age (the human equivalent of ~20 years old). Abrupt-onset CR, particularly in adulthood, has been shown to be traumatic and often counterproductive for longevity depending on the strain. This abrupt initiation of CR may very well have contributed to the early mortality observed in the fragile NPY-/- mice. Plus, starting CR at the equivalent of age 20 is a lot earlier than any of us humans started. That wouldn't be so bad, except for the fact that I highlighted in the first quote I made from this study at the top of this post, namely:

 

Genetic disruption of Npy signaling, however, exerts subtle effects on feeding and weight gain in young mice [ref, 3], most likely due to compensatory changes in the neuroendocrine system that normalizes feeding and energy expenditure in the absence of Npy.

Notice the authors specifically call out how young mice are particularly susceptible to having their neuroendocrine system (and metabolism in general) messed up by the disruption of NPY signalling? This would seem to further undermine the relevance of this study for human CR practitioners, since most of us started CR at around twice the age of these 20-year-old equivalent mice.

 

The authors didn't say exactly what they meant by "young mice", and only one of the two references they give in support of this assertion was available in full-text form [3]. Study [3] tested food anticipatory activity in response to CR in NPY knockout mice starting at 9-10 weeks of age, pretty close to the 12 weeks of age at which [1] was started. I'm not exactly sure what point about "young mice" the authors of [1] were trying to make with the above statement, but it turns out that [3] is a goldmine, not just for dismissing NPY as relevant to the hunger hypothesis, but for doing the same for a bunch of other so-called 'hunger hormones' as well.

 

The title of [3] pretty much sums it up: Single gene deletions of orexin, leptin, neuropeptide Y, and ghrelin do not appreciably alter food anticipatory activity in mice.

 

In other words, knocking out any of the major so-called "hunger hormones", including NPY, has no effect on the apparent subjective hunger of mice, as measured by "food anticipatory activity" (FAA) - i.e. the tendency of mice to get agitated, chew on the cage, check the food bin, etc. for several hours prior to mealtime when subjected to CR.

 

For NPY in particular, [3] found that the weight, weight gain, and food intake of NPY-/- mice were virtually identical to those of WT mice when fed ad lib. Once again, as in [1], we see no sign that NPY-/- mice were any less hungry than normal mice, based on how much they ate when given free access to food. This is confirmed by the fact that when calorie restricted, the NPY-/- mice showed no less food anticipatory activity than normal mice - i.e. they got just as agitated as the wild-type mice in the time leading up to dinner. The authors of [3] conclude:

 

Overall, our results do not support critical roles for any of the genes studied—leptin, orexin, NPY, and ghrelin—acting individually in mediating FAA [food anticipatory activity] by CR...

So the food intake data from [1] and [3], as well as the behavioral markers of subjective hunger in [3], call into question the very idea that mice lacking NPY are any less hungry than normal mice. And the same goes for all the other hunger hormones including leptin, orexin and ghrelin.

 

In conclusion, knockout of NPY (or any of the other 'hunger hormones') doesn't appear to make mice any less hungry than normal mice. Further, NPY knockout mice have a level of this important hormone (i.e. zero) that is not only unrepresentative of normal physiology, but also likely makes them fragile and therefore less able to withstand the rigors of CR, especially when CR is initiated abruptly in adulthood as it was in [1]. This fragility likely explains both the early mortality and the (non-significant) reduction in longevity benefits from CR in the NPY knockout mice in [1]. This seems to me to make [1] exactly the kind of study of fragile, genetically f*cked-up mice that you, Michael, regularly say we should ignore as irrelevant when assessing the effectiveness of CR.

 

So tell me again Michael - why do you think [1] is at all relevant evidence in support of the idea that NPY plays an important causal role in eliciting the CR response, and even harder to fathom, why it might be relevant evidence in support of the hunger hypothesis?

 

Finally, [3] seems to cast serious doubt on the whole idea that subjective hunger is correlated with the level of any of the so-called 'hunger hormones' (including NPY), which HH advocates point to as the likely causal link between subjective hunger and CR longevity benefits.

 

So much for the hunger hypothesis. Go fiber!

 

--Dean

 

-------------

[1] Sci Rep. 2014 Mar 31;4:4517. doi: 10.1038/srep04517.

 

A key role for neuropeptide Y in lifespan extension and cancer suppression via dietary restriction.

 

Takuya Chiba, Yukari Tamashiro, Daeui Park, Tatsuya Kusudo, Ryoko Fujie, Toshimitsu Komatsu, Sang Eun Kim, Seongjoon Park, Hiroko Hayashi, Ryoichi Mori, Hitoshi Yamashita, Hae Young Chung & Isao Shimokawa

 

PMID: 24682105

 

----------------

[2] Aging Cell. 2010 Feb;9(1):92-5. doi: 10.1111/j.1474-9726.2009.00533.x. Epub 2009 Oct 30.

Genetic variation in the murine lifespan response to dietary restriction: from life extension to life shortening.

Liao CY, Rikke BA, Johnson TE, Diaz V, Nelson JF.

PMCID: PMC3476836
PMID: 19878144

 

----------

[3] PLoS One. 2011 Mar 28;6(3):e18377. doi: 10.1371/journal.pone.0018377.

Single gene deletions of orexin, leptin, neuropeptide Y, and ghrelin do not appreciably alter food anticipatory activity in mice.

Gunapala KM(1), Gallardo CM, Hsu CT, Steele AD.

Author information: (1)Broad Fellows Program in Brain Circuitry, Division of Biology, California Institute of Technology, Pasadena, California, United States of America.

Timing activity to match resource availability is a widely conserved ability in nature. Scheduled feeding of a limited amount of food induces increased activity prior to feeding time in animals as diverse as fish and rodents. Typically, food anticipatory activity (FAA) involves temporally restricting unlimited food access (RF) to several hours in the middle of the light cycle, which is a time of day when rodents are not normally active. We compared this model to calorie restriction (CR), giving the mice 60% of their normal daily calorie intake at the same time each day. Measurement of body temperature and home cage behaviors suggests that the RF and CR models are very similar but CR has the advantage of a clearly defined food intake and more stable mean body temperature. Using the CR model, we then attempted to verify the published result that orexin deletion diminishes food anticipatory activity (FAA) but observed little to no diminution in the response to CR and, surprisingly, that orexin KO mice are refractory to body weight loss on a CR diet. Next we tested the orexigenic neuropeptide Y (NPY) and ghrelin and the anorexigenic hormone, leptin, using mouse mutants. NPY deletion did not alter the behavior or physiological response to CR. Leptin deletion impaired FAA in terms of some activity measures, such as walking and rearing, but did not substantially diminish hanging behavior preceding feeding time, suggesting that leptin knockout mice do anticipate daily meal time but do not manifest the full spectrum of activities that typify FAA. Ghrelin knockout mice do not have impaired FAA on a CR diet. Collectively, these results suggest that the individual hormones and neuropeptides tested do not regulate FAA by acting individually but this does not rule out the possibility of their concerted action in mediating FAA.

PMCID: PMC3065493
PMID: 21464907


For anyone who, despite my post above, may still be doggedly clinging to the "Hunger Hypothesis" (i.e. CR benefits require being hungry, likely as a result of elevated neuropeptide Y - NPY), have I got a sweet new study for you! 

 

Read this post over on the sweetener thread about a new study (PMID 27411010) titled:

 

Sucralose Promotes Food Intake through NPY and a Neuronal Fasting Response

 

and judge for yourself its implications for the hunger hypothesis. I'm still scratching my head over it...

 

--Dean

