kpfleger Posted January 23, 2017

A seemingly basic but practical issue that comes up immediately when you try to track food intake: how do you deal with the fact that measuring is easier on foods in their raw state, but nutrition info differs once cooked?

Nutrition databases such as the USDA's typically have entries for both raw and cooked versions of foods, and the nutritional profiles often differ significantly in at least some important nutrients. But databases like the USDA's, and food-tracking software such as CronOMeter or MyFitnessPal, don't seem to provide entries for "however much you get when you cook such-and-such a measured raw amount," which seems to me the natural thing you'd want. (I could almost imagine that all the raw grain entries are supposed to be interpreted that way, but surely not the raw veggie entries, because you also clearly want values for the amount when eaten raw.)

When cooking most grains, it makes much more sense to measure the grain dry and then get nutrition info for whatever that dry amount yields when cooked (in water, say), rather than to weigh the cooked result, where variable water weight could skew everything by a scale factor. Similarly, if you sauté 3-5 vegetables into a stir-fry, you can't really measure them separately after cooking, so you have to measure them raw. But then which database entry do you use for each, raw or cooked? And if you use the raw entry, how do you scale it to the cooked dish: assume the calorie count stays the same? For weight and water content, surely the cooking method matters. How do you deal with sautéed or baked vegetables if the database provides only raw and boiled? Are most cooking methods close enough for macro- and micronutrient content (even if water and weight differ)?

-Karl
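The "measure dry/raw, log by raw weight" bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not how CronOMeter or MyFitnessPal actually work internally: the per-100 g values below are made up, and the assumption that nutrients scale with the raw weight (ignoring cooking losses, e.g. of heat-sensitive vitamins) is exactly the simplification the post is questioning.

```python
def nutrients_for_raw_weight(per_100g_raw, grams_raw):
    """Scale a per-100 g *raw* database entry to the raw weight actually measured.

    Water gained or lost in cooking changes the cooked weight on the scale,
    but not the amounts computed here, so the result is independent of the
    cooking method -- to first order only, since heat-labile nutrients
    (vitamin C, some B vitamins) are partly lost in cooking.
    """
    return {k: v * grams_raw / 100.0 for k, v in per_100g_raw.items()}

# Hypothetical per-100 g entry for dry brown rice (values are illustrative):
dry_brown_rice = {"kcal": 370, "protein_g": 7.9, "fiber_g": 3.5}

# Measure 60 g dry; whatever the pot weighs after boiling, log this:
print(nutrients_for_raw_weight(dry_brown_rice, 60))
```

A more faithful version would multiply each nutrient by a per-nutrient, per-method retention factor (the USDA publishes tables of these), which is one answer to the "raw entry, scaled how?" question in the post.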