Nonetheless, the media blurbs and even quotes from the scientists themselves suggest this study has a major case of mistaken identity. The lead researcher Frank Hu claimed the study “provides clear evidence that regular consumption of red meat, especially processed meat, contributes substantially to premature death,” despite the fact that the study is innately incapable of providing such evidence. It’s as if someone pulled a Campbell on us. Only an actual experiment, with controls and manipulated variables, could start confirming causation.
But the study’s over-extrapolation isn’t really that surprising. A conclusive experiment is what every observational study secretly yearns to be, deep down in its confounder-riddled, non-randomized heart. And like pushy stage mothers, some researchers want their observational studies to be more talented and remarkable than they truly are, leading to the scientific equivalent of a four-year-old wobbling around in stilettos at a beauty pageant. Our study at hand is a perfectly decent piece of observational literature, but as soon as its authors (or the media) smear it with lipstick and make it sing Patsy Cline songs on stage, it’s all downhill from there.
Food Frequency Questionnaires: A Test of Superhuman Memory and Saint-like Honesty
To kick this analysis off, let’s take a look at how the study was actually conducted. As the researchers explain, all of the diet data came from a series of food frequency questionnaires (FFQs) that the study participants filled out once every four years, starting in the 1980s and ending in 2006. (If you’re feeling brave, you can read the questionnaire yourself (PDF) and try imagining how terribly the average, non-diet-conscious person might botch their responses.) The lifestyle and medical data came from additional questionnaires administered every two years.
The full text of our study offers some additional details (emphasis mine):
Notice that one of the foods listed under “unprocessed red meat” and likely a major contributor to that category is hamburger, the stuff fast-food dreams are made of. Although this study tracked whole grain intake, it didn’t track refined grain intake, so we know right away we can’t totally account for the white-flour buns wrapped around those burgers (or many of the other barely-qualifying-as-food components of a McDonald’s meal). And unless these cohorts were chock full of folks who deliberately sought out decent organic meat, it’s also worth noting that the unprocessed ground beef they were eating probably contained that delightful ammonia-treated pink slime that’s had conventional meat consumers in an uproar lately.
Next, we arrive at this little gem:
Ding ding, Important Thing alert! As anyone who’s spent much time on earth should know, expecting people to be honest about what they eat is like expecting one of those “Lose 10 pounds of belly fat” banners to take you somewhere other than popup-ad purgatory: the idealism is sweet and all, but reality has other plans.
And so it is with food frequency questionnaires. Ever since these questionnaires were first birthed unto the world, scientists have lamented their most glaring flaw: people tend to report what they think they should be eating instead of what actually goes into their mouths. And that’s on top of the fact that most folks can barely remember what they ate yesterday, much less what they’ve eaten over the past month or even the past year.
As a result, researchers compare the results of food frequency questionnaires against more accurate “diet records,” in which folks meticulously weigh and record everything they eat for a straight week or two, to see how the data match up. If we follow that last quote to the links it references, we end up at one of the validation reports for the food frequency questionnaire used with the Health Professionals Follow-up Study. Here’s where it gets interesting:
This shouldn’t come as a shocker if we consider human psychology. Unless we literally live in a cave, most of us are constantly inundated with messages about how high-fat dairy, meat, sweets, desserts, and anything delicious and creamy is going to either make us fat or give us a heart attack, while it’s more like “hallowed be thy name” for fruits and veggies. Is it any wonder that folks tend to under-report their intake of “bad” foods and over-report their intake of the good ones? Who wants to admit in the terrifying permanency of a food questionnaire that yes, they do bury their salad in half a cup of Hidden Valley Ranch, and they do choose white bread because 12-Grain Oroweat tastes like lightly sweetened wood chippings, and sometimes they even go a full three days where their only vegetable is ketchup? If food frequency questionnaires were hooked up to a polygraph, we might see some very different data (and some mysteriously disappearing respondents).
Another reference in our study du jour takes us to a validation report for the Nurses’ Health Study questionnaire. And here we find the same trend:
Of course, if everyone over-reported or under-reported their food intake with the same magnitude of inaccuracy, we could still find some reliable associations between food questionnaires and health outcomes. But it turns out that how much someone fudges their food reporting (especially for specific menu items) varies wildly based on their personal characteristics. Using an Aussie-modified version of the Nurses’ Health Study questionnaire, a study from Australia measured how accurately people reported their food intake based on their gender, age, medical status, BMI, occupation, school-leaving age, and use of dietary supplements. As with the other validation studies, it compared the results of the food frequency survey with the Almighty Weighed Food Record.
The surprising results? Folks with a “diagnosed medical condition” (including high cholesterol, high triglycerides, diabetes, high blood pressure, stroke, cancer, and heart disease) were much more likely to mis-report their meat consumption than folks without a diagnosed medical condition, generally overestimating their true intake on food frequency questionnaires compared to the weighed food record. Why this occurred is one of life’s great mysteries, but it might have something to do with the fact that people who develop diet- and lifestyle-related diseases pay less conscious attention to what they eat. (In this study, women were also more likely to inaccurately report their intake for a wide variety of foods, a phenomenon that’s been examined in greater depth by other researchers.)
So what does this mean for studies based on food frequency questionnaires, like the one currently hijacking the news outlets? Unfortunately for lovers of scientific accuracy, it means that meat consumption and modern diseases might be statistically more likely to show up hand-in-hand by mere fluke. If sick folks have a tendency, for whatever reason, to say they’re eating more meat than they really are, that’ll have profound effects on any diet-disease associations that turn up in observational studies, where correlations hinge so heavily on the accuracy of the data. And if the results of that Australian study are applicable not only in the Land Down Under but also in the Land Up Over, it could mean that meat is pretty much doomed to look guilty by association with disease whenever food frequency questionnaires are involved. Woe is meat!
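The statistical problem here can be made concrete with a toy simulation (entirely my own illustration, with made-up numbers, not data from any of the studies discussed): suppose true meat intake is identical for sick and healthy people, but diagnosed folks over-report their intake by a few servings a week on the questionnaire, roughly in the spirit of what the Australian validation study found. Looking at the reported data alone, meat would still appear tied to disease.

```python
import random

random.seed(0)

# Hypothetical numbers for illustration only: everyone truly eats about
# the same amount of meat, so true intake has NO relationship to disease.
n = 10_000
true_meat = [random.gauss(10, 2) for _ in range(n)]   # true servings/week
diseased = [random.random() < 0.2 for _ in range(n)]  # 20% have a diagnosis

# Differential misreporting: diagnosed folks over-report meat on the FFQ.
reported = [m + (3 if d else 0) + random.gauss(0, 1)
            for m, d in zip(true_meat, diseased)]

def mean(xs):
    return sum(xs) / len(xs)

sick = [r for r, d in zip(reported, diseased) if d]
well = [r for r, d in zip(reported, diseased) if not d]

# The questionnaire data alone make the sick group look like heavier
# meat eaters, even though true intake is identical across groups.
print(f"reported meat, diseased group: {mean(sick):.1f} servings/week")
print(f"reported meat, healthy group:  {mean(well):.1f} servings/week")
```

Nothing about this sketch proves that’s what happened in the Harvard study, of course; it just shows how a reporting bias that differs between sick and healthy respondents can manufacture a meat-disease correlation out of thin air.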
March 15, 2012
Copyright © 2012 Mark's Daily Apple