
basic food tracking question: raw vs cooked entries


kpfleger


A seemingly basic but very practical issue comes up immediately when you try to track food intake: how do you deal with the fact that measuring is easier when foods are in their raw state, but the nutrition info differs once they are cooked?
 

Nutrition databases such as the USDA's typically have entries for raw and cooked versions of foods, and their nutritional profiles often differ significantly in at least some important nutrients. But databases such as the USDA's, and food-tracking software such as Cronometer or MyFitnessPal, don't seem to provide entries for however much cooked food you get when you cook a measured amount raw, which would seem to me to be the natural thing you want. (I could almost imagine that all the raw grain entries are supposed to be interpreted that way, but surely not the raw veggie entries, because you also clearly want the amount when eaten raw.)

 
When cooking most grains, it makes much more sense to measure the grain dry and then use nutrition info for whatever that dry amount becomes when cooked (in water, say), rather than trying to weigh the cooked result, where the amount of water weight could vary and skew results by a scale factor. Similarly, if you sauté 3-5 veggies into a stir-fry, you can't really measure them separately after cooking, so you have to measure raw. But then which database entry do you use for each, raw or cooked? And if you use the raw entry, how do you scale it to account for cooking? Assume the calorie count stays the same? For weight and water content, surely the cooking method matters.
 
How do you deal with sautéed or baked vegetables if the database provides only raw and boiled entries? Are most cooking methods close enough for things like macro- and micronutrient content (even if water and weight issues differ)?
 
-Karl
 

I also wonder how many blatant errors there are in the USDA data. You see some strange things when comparing the raw vs. cooked versions of some pretty basic foods, looking for examples where it matters which one you use (for important nutrients).

 

Boiled spinach has much less vitamin C. That one makes sense, I think. Good enough reason to care whether you use the raw or the boiled entry?

 

But consider broccoli: the USDA data says that 100 calories raw has 1062ug of beta-carotene but that 100 calories boiled has 2654ug. How did boiling increase the concentration of beta-carotene per calorie by more than 2.5x? Choline concentration also more than doubles. PUFA, MUFA, and saturated fats also look odd. Perhaps these values are just so small for broccoli that measurement noise is large relative to them.

 

In the case of carrots, the USDA raw entry has 0.11mg copper per 100 calories but the boiled entry has only 0.05mg. Selenium goes the other way, with boiled having 8x the concentration of raw (0.24ug selenium per 100 calories of raw carrots vs. 2.00ug per 100 calories boiled). Do the carrots pick up selenium from the cooking water? So which entry should be used for baked or sautéed carrots? Selenium and copper are both nutrients where the range of desirable intakes is pretty narrow (the tolerable upper limit is not crazy high compared to the adequate intake), so this seems like it might not be a purely academic question if you are actually trying to track toward good intake ranges. (I should probably go check the normal daily levels for comparison. Later....)
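For anyone who wants to reproduce these comparisons, the normalization is just "amount per 100 calories"; here is a minimal sketch in Python with placeholder numbers (not actual USDA values) of putting raw and cooked entries on an equal-energy footing:

# Minimal sketch: convert a per-100 g database entry into "per 100 kcal" so that
# raw and cooked versions can be compared on an equal-energy basis.
# The numbers below are placeholders, not actual USDA values.

def per_100_kcal(amount_per_100g, kcal_per_100g):
    """Convert a nutrient amount per 100 g into an amount per 100 kcal."""
    return amount_per_100g * (100.0 / kcal_per_100g)

raw = {"kcal": 34.0, "beta_carotene_ug": 361.0}      # placeholder per-100 g values
boiled = {"kcal": 35.0, "beta_carotene_ug": 929.0}   # placeholder per-100 g values

for label, entry in (("raw", raw), ("boiled", boiled)):
    density = per_100_kcal(entry["beta_carotene_ug"], entry["kcal"])
    print(f"{label}: {density:.0f} ug beta-carotene per 100 kcal")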

 

 

Lastly, though not specifically a raw vs. cooked issue, there is the more general practical question of how to make good choices when selecting entries in food-logging software or nutrient databases (and the related issue of whether to trust the data). Consider medium- vs. long-grain brown rice: USDA says long-grain has 2.4mg calcium per 100 calories (cooked or raw) vs. ~9mg calcium for 100 calories of medium-grain. I wouldn't have guessed there would be a 3-4x difference. Should we chalk this up to the fact that these are pretty small absolute amounts of calcium relative to daily levels, so the absolute difference probably isn't important?


For the nutrients that increase when cooked, perhaps what is being quantified is the amount of the nutrient that is absorbable? While cooking can destroy nutrients, it can also enhance digestibility in some cases.

 

I wonder how useful this sort of nutritional tracking actually is. There is an awful lot of variability that is ignored. We've discussed how wildly olive oils can vary in numerous properties based on the variety of olive, where and how the olives are grown, how the oil is processed and stored, and how fresh it is. Why wouldn't other foods, say tomatoes, also vary in their nutritional profile with all of these factors? And then, when one takes into account the interactions between foods and cooking methods and the differences in our gut microbiomes and genetics, it seems likely that one could be hitting a target dead on at 100% and still be deficient, or perhaps even overdosed.


Todd, it sure is as you say. Sometimes we may even perceive a variation in sugars between one piece of fruit and another picked from the same tree. Variability is a law of nature. Software like Cronometer might give a false sense of assurance; things are surely not as precise as the software would suggest.

 

One way out of this predicament might be a probabilistic version, where the database of macros and micros includes the data uncertainty in a codified fashion (a probability distribution) and the sums are done not over single numbers (deterministically) but over the individual distributions (probabilistically), with a correspondingly probabilistic output.

 

In a few words, our caloric intake would no longer be a single number but a set of numbers, represented by a statistical distribution (the normal distribution, for instance) or by a useful, immediately readable interval.

 

My daily caloric intake might be reported as a 90% interval, which rules out the extremes on both sides and tells me that today I ingested somewhere between 2300 and 2700 calories.

This would be closer to reality. The uncertainty cannot be excluded from the analysis, because it is inherent in the measurement error, in the variable content of food, in the way nutrient content degrades with time after picking and with cooking, in individual absorption, and so on.
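Just to make the concept concrete, here is a minimal sketch of such a probabilistic sum (the foods and their coefficients of variation are invented for illustration): each entry is treated as a normal distribution, the distributions are summed by Monte Carlo, and the output is a 90% interval rather than a single number.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical day's log: (label, mean kcal, assumed coefficient of variation).
# The CVs are invented; a real system would store measured or published
# uncertainty for each database entry.
foods = [
    ("oatmeal, measured dry", 300, 0.10),
    ("mixed stir-fry veggies", 250, 0.25),
    ("olive oil", 240, 0.05),
    ("fruit", 180, 0.20),
]

n_draws = 100_000
total = np.zeros(n_draws)
for _, mean_kcal, cv in foods:
    total += rng.normal(mean_kcal, mean_kcal * cv, size=n_draws)

low, high = np.percentile(total, [5, 95])
print(f"Daily calories: mean {total.mean():.0f}, 90% interval {low:.0f}-{high:.0f}")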

 

In the end, with the current deterministic software, individual instinct and trial and error should always temper the overly precise output values.

 

One of the simplest checks is maybe our weight scale. It gives us gross but irrefutably objective evidence of the results of our dietary intake in terms of energy balance. Gordo, as I understand it, follows this method, instinctively understanding the futility of all our precise measurements (which I myself keep doing).

 

Micros are harder to monitor, so precautionary measures like supplementation or moderate over-ingestion are common, plus blood testing, self-analysis, and all the other things the members of this forum routinely undertake.

 

Things like protein restriction/optimization are a chimera, an unreal objective if we reason in this realistic way. Even if we took 24-hour urine tests, we still wouldn't know the real protein intake because of the uncertainty. Our final calculation of the zero-nitrogen balance would yield a wide interval, and consequently we would be compelled to fall back on trial and error to ascertain its real value (which would also vary with time and other conditions).


Sometimes we may even perceive a variation in sugars between one piece of fruit and another picked from the same tree. Variability is a law of nature.

Yeah, everything is in flux all the time -- you, me, every fruit and vegetable chemical, cooked, raw, every legume and drop of precious olive oil -- and so we enter the poetic realm. I've always and only viewed Cronometer as a hint, a fluctuating guidepost, a dim light, not much better than simply holding a finger to the stars and guessing how constellations affect my biology: I'm a Pisces, what's ur sign? Cronometer is something rather than nothing, thank you for it, I paid and subscribed; but like trying to determine even some easy-tool body measurement like blood pressure, numbers are destined for nonsense. One moment it says x/x, the next moment it reads something new. Why bother even taking it since who knows. But then there's criticism oh that's too negative, oh a bad attitude, oh too narrow is the bleak view, and besides, I'm overlooking helpful indication ranges. Or pushing up the daisies of false hope.

 

A flowery hint, that's what we get, distant colors for the nearly blind. Poetry. More art than science. Take a car into a shop, they'll stick diagnostic tools into it, a computer reads some numbers, tells some technician some valuable hints, this wasn't possible ten years ago, and the rambling body human is so much more complex, and look: there are AI pop stories promising better future tools. "Don't hold your breath for AI to help us find better health," then we're told, and so we're grateful for whatever's here today that wasn't here yesterday. Like beggars.

 

Meanwhile, think about funding tiny efforts like http://www.lifespan.io/campaigns/cellage-targeting-senescent-cells-with-synthetic-biology/ "[D]esigning synthetic promoters for safe and precise targeting of dysfunctional “senescent” cells, with the aim of developing senolytic gene therapies to remove them..." If I can give $25, nearly anyone can.


In posts about Cronometer, everyone praises it, says they love it, says they paid, etc. In posts about fasting-mimicking diets, people post their Cronometer analyses. In posts about micronutrient sufficiency/overdose, people say to track with Cronometer. But in a post asking a basic practical question about Cronometer use (raw or cooked entry, and measure before or after cooking), everyone ignores the question and waxes poetic about how futile its use is due to huge errors!

 

I guess I shouldn't have made the 2nd post about diving into the database. :-)


kpfleger, I don't know if you have read Frank Alvarez's answer to a similar question posed in the Cronometer forum:

 

 

 

No easy answer here. The database from the USDA we use has this statement in regards to cooked and raw foods:

“Nutrient retention factors are based on data from USDA research contracts, research reported in the literature, and USDA publications. Most retention factors were calculated by the True Retention Method (%TR) (Murphy et al., 1975). This method, as shown below, accounts for the loss or gain of moisture and the loss of nutrients due to heat or other food preparation methods:

%TR = (Nc*Gc) / (Nr*Gr) * 100
Where
TR = true retention
Nc = nutrient content per g of cooked food,
Gc = g of cooked food,
Nr = nutrient content per g of raw food, and
Gr = g of food before cooking.”

It is not an exact science. Some foods cooked unlock nutrients for us, some don’t. Cooking methods are all different and the data may not clarify how a food was prepared. Where and when the food was grown determine nutrient levels. Do you keep the water after you cook it? You see where I am going! Numerous variables.

I would do the best you can and estimate what you start with raw and what happens as you cook it. 100g may equal 80 grams after, I can’t give you a good answer. It is more than likely going to be a small difference in the grand scheme of things in my opinion as a nutritionist. We can only do so much with the data we have.

 

Remember the RDA is a minimum and the TUL (tolerable upper limit) is the max recommended. Aim to get the RDA and under the TUL. The gap usually is fairly large and overshooting the RDA to be sure you get that much actually in you, shouldn’t be hard.
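For what it's worth, the quoted %TR formula is straightforward to apply yourself; here is a minimal sketch (the spinach weights and vitamin C values are invented, just to show the mechanics):

def true_retention(nutrient_per_g_cooked, g_cooked, nutrient_per_g_raw, g_raw):
    """%TR = (Nc*Gc) / (Nr*Gr) * 100, the True Retention Method quoted above."""
    return (nutrient_per_g_cooked * g_cooked) / (nutrient_per_g_raw * g_raw) * 100.0

# Invented example: 100 g of raw spinach at 0.28 mg/g vitamin C cooks down to
# 70 g at 0.10 mg/g remaining.
tr = true_retention(nutrient_per_g_cooked=0.10, g_cooked=70.0,
                    nutrient_per_g_raw=0.28, g_raw=100.0)
print(f"True retention of vitamin C: {tr:.0f}%")  # -> 25%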


Thanks mccoy! That discussion started with exactly the same concerns as mine. I echo the comment in the 4th reply there:

I am surprised that there haven’t been a lot more posts about this. As soon as I started with Cronometer I immediately had to grapple with these problems.

 

I will continue the discussion there.

 

But briefly, to follow up on my question of whether some of the example differences I found are significant relative to the range from RDA/AI to TUL (tolerable upper limit): selenium and copper are two of the micronutrients with a reputation for having narrower ranges between adequate and upper intake amounts (about an 8x range for selenium, 55-400mcg, and 11x for copper, 0.9-10mg). The differences I found in 100 calories of raw vs. cooked carrots, though large as multiples, are small absolute differences relative to these ranges and thus shouldn't matter, since few people will be eating 500+ calories of carrots per day.
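Spelling out the arithmetic as a quick sanity check (using the per-100-calorie carrot figures from my earlier post and the RDA/TUL ranges above; the 200-calorie daily serving of carrots is just an assumption):

# Per-100-kcal figures from the USDA raw vs. boiled carrot entries cited earlier.
selenium_per_100kcal = {"raw": 0.24, "boiled": 2.00}   # micrograms
copper_per_100kcal   = {"raw": 0.11, "boiled": 0.05}   # milligrams

carrot_kcal = 200  # assumed daily amount of carrots; generous for most people
scale = carrot_kcal / 100

for entry in ("raw", "boiled"):
    se = selenium_per_100kcal[entry] * scale
    cu = copper_per_100kcal[entry] * scale
    print(f"{entry}: {se:.1f} ug Se (RDA 55, TUL 400), {cu:.2f} mg Cu (RDA 0.9, TUL 10)")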

 

My take-home: Measure in whichever form (raw, cooked, etc.) is most convenient and use the entry in the app/database that corresponds to the form the food was in at the time you did the measurement.


I'm glad you are asking this question because I have the same exact question (and I have had it for a long time, but just have never asked it).

 

I'd be very curious to hear how people deal with this issue practically speaking. I don't expect Cronometer to be 100% accurate, but surely there are some best practices that people can recommend.

 

The other issue is that I make a lot of recipes. It really is impossible for me to cook each ingredient separately, measure the cooked amounts, and only then mix them together.

 

Here is what I have been doing:

 

For the most part I measure raw ingredients, but I'm pretty inconsistent about whether I choose the raw or the cooked version of that food when I enter it into Cronometer. I go with my gut depending on what it is, trying to feel my way toward some kind of logic. So far I haven't found it. I would say I choose the raw version more often, because I am measuring raw, even if I end up tossing it into a pot of soup and cooking it.

 

As you can see, what I'm doing isn't so great. So, how should I do it better?

 

What do other people do?


RE: the inherent variability of foods. kpfleger mentioned selenium, which reminded me of an article on the variability of Se in Brazil nuts. The variability is, to say the least, enormous. In such cases there is not much to do unless we have the Se content on the package label (if it is reliable) or we know the provenance of the nuts themselves. What I do: the USDA amount in Cronometer is 98 micrograms per nut. Since this value sits in an intermediate position in the graph, I can be happy with that, keeping in mind though that the real value might oscillate between about 10 and 220 micrograms per nut. By eating just 3 nuts, I might be below the RDI or above the TUL.
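As a quick check on those numbers (per-nut values as above; selenium RDA and TUL as cited earlier in the thread):

per_nut_low, per_nut_logged, per_nut_high = 10, 98, 220  # mcg Se per Brazil nut
rda, tul = 55, 400  # mcg/day selenium

nuts = 3
low, logged, high = (nuts * x for x in (per_nut_low, per_nut_logged, per_nut_high))
print(f"{nuts} nuts: logged {logged} mcg, plausible range {low}-{high} mcg "
      f"(RDA {rda} mcg, TUL {tul} mcg)")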

 

[Attached chart: selenium-chart-1.jpg, selenium content per Brazil nut]

 

The article is a technical note that is part of 'the call of the honeyguide' blog.


..

 

For the most part I measure raw ingredients, but I'm pretty inconsistent about whether I choose the raw or the cooked version of that food when I enter it into Cronometer. I go with my gut depending on what it is, trying to feel my way toward some kind of logic. So far I haven't found it. I would say I choose the raw version more often, because I am measuring raw, even if I end up tossing it into a pot of soup and cooking it.

 

As you can see, what I'm doing isn't so great. So, how should I do it better?

 

What do other people do?

 

Right now I do as you are doing, following what seems the most logical path given the uncertainties.

 

Another important issue: is that rigour really necessary? I find it is only on limited occasions, for example when following Valter Longo's FMD, where it is pretty important not to go above a certain caloric threshold. In that case, I might choose the upper-bound solution, the one that yields more calories, since that is the most conservative choice in this specific case (that is, I can be pretty sure I do not go above the threshold, even though I accept that I probably eat less than allowed).


The USDA gives a pretty good description of what happens to its nutrients when food is processed:

 

http://nutritiondata.self.com/topics/processing

 

The table is labeled "typical maximum nutrient losses". "Typical maximum" seems like an oxymoron: what is being averaged over, and what set of things is having its maximum taken? Different foods and different cooking times/degrees of done-ness?

 

It's interesting that this table distinguishes cooking from cooking+draining (as it should, since many nutrients transfer to the water), but the text label of many USDA food entries just says "boiled" without making clear whether draining happened. (I also don't remember the few raw vs. cooked foods that I compared in a spreadsheet having nutrient reductions as big as this table suggests.)

 

Clearly there is a big practical user-interface issue here. This chart is interesting to know about but not terribly useful to users of Cronometer, MyFitnessPal, etc. If you use your own spreadsheets (or, even harder, download the full USDA data into a SQL database) for everything you eat (impractical) or for meal/menu/recipe planning (still hard, and probably done by few), then you can maybe plug the numbers from this chart in and use raw values adjusted for nutrient loss, but typical users are the ones using the apps. I guess if one is happy with accuracy only within a factor of 2, then one can use raw entries and take comfort in the fact that most of the percentages in this table are less than 50%, and the ones that aren't are mostly nutrients that are easy to get an adequate supply of with a reasonable diet.
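If you did wire that chart into a spreadsheet or script, the adjustment itself is trivial; here is a minimal sketch (the nutrient amounts and loss percentages below are placeholders, not values from the linked table):

# Minimal sketch: apply "typical maximum loss" percentages to a raw entry to get
# a worst-case estimate of what survives cooking. All numbers are placeholders.
raw_entry = {"vitamin_c_mg": 80.0, "folate_ug": 60.0, "vitamin_k_ug": 100.0}
max_loss_pct = {"vitamin_c_mg": 50, "folate_ug": 45, "vitamin_k_ug": 5}

cooked_estimate = {
    nutrient: amount * (1 - max_loss_pct.get(nutrient, 0) / 100)
    for nutrient, amount in raw_entry.items()
}
print(cooked_estimate)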

 

 

There are ~8000 different food entries in the USDA dataset, many of them pretty obscure. I'm baffled why they didn't add entries for the nutrient content of however much cooked food results from measuring 100g pre-cooking. Surely, for any food they provided a cooked entry for, it would have been no harder to provide this kind of entry as well.

 

I haven't studied all the different tables the USDA provides. Maybe they have a table somewhere that says how much cooked food 100g of a given raw food becomes, but if so, its usefulness did not make it through to apps like Cronometer.
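If such a yield table did exist, combining it with the cooked per-100g entry would be a one-liner; a minimal sketch (the 0.89 yield factor and nutrient numbers are invented purely for illustration):

def cooked_entry_per_100g_raw(cooked_per_100g, yield_factor):
    """Nutrients in the cooked food that results from measuring 100 g raw.
    yield_factor = grams of cooked food obtained per gram of raw food."""
    return {k: v * yield_factor for k, v in cooked_per_100g.items()}

# Invented numbers: 100 g raw becomes 89 g cooked; cooked entry is per 100 g cooked.
cooked_per_100g = {"kcal": 35.0, "vitamin_c_mg": 20.0}
print(cooked_entry_per_100g_raw(cooked_per_100g, yield_factor=0.89))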

 

 

 

On the issue of nutrient variability: sure, there is variation, but I'm not sure how much practical variation there is for most foods. Brazil nuts may be an extreme case. Supposedly the USDA has added, or is adding, some quantification of variability, but again it matters little for app users until it gets incorporated into those systems. Few people are doing this with a spreadsheet or direct SQL queries on the database.


