Is red meat killing us?

Ironically, as I’m writing up this piece, yet another negative study is being splashed across news channels; this one revealing an association between bacon consumption and bowel cancer. There is no shortage of studies claiming that red meat, in one way or another, is killing us. But is it really? And if it isn’t, why on earth is there so much science saying that it is? With the Easter Bank Holiday approaching, I’m sure you’re wondering whether that lovely roast lamb is putting you at risk… In order to answer the question, I am going to look in detail at one of the largest studies published to date on the subject. You may remember this one: it had every media outlet on the planet proclaiming the mortal dangers of red meat in 2012.

Science, and particularly nutritional science, is all about controlling for biases. And before I go any further, a quick word about the most critical bias of all: your own; my own. As Richard Feynman famously said:

“The first principle is that you must not fool yourself — and you are the easiest person to fool.”

We’re all guilty of this, and I am no exception. Biases are so insidious that you don’t even notice them. So I should admit to you that my natural bias inclines me to believe that red meat is not killing us in any significant way. Certainly, I don’t believe that moderate quantities of unprocessed, high-quality red meat are a big killer. In order to write this piece objectively, I therefore had to pay careful attention to my own bias. If you want to get to the truth, you need to treat every piece of evidence with the same level of scrutiny. That’s what I have tried to do here. As an aside, there are a number of other reasons why you may choose not to eat red meat, or any meat at all. If you avoid red meat for ethical or environmental reasons, or because it just doesn’t work for you, those are entirely different, and valid, considerations.

Epidemiology and the criteria of good scientific research

Before we get into the details, we should have a set of criteria by which to judge scientific research. The study in question is epidemiological (or observational), as most studies in this field are. In this kind of study, scientists take large cohorts of individuals and over time record two sorts of data: one related to a hypothesised risk factor and the other related to a negative outcome of interest. In this case, the researchers set out to record data on red meat consumption (the risk factor) and death (the outcome). Once the researchers have a sufficient quantity of data, they use statistical techniques to examine the correlation, or association, between the risk factor and the outcome.

So what can go wrong with a study like this? The truth is, a lot. First of all, studies like this can never prove causality. They can only demonstrate statistical association, and thereby suggest causality. There is a huge difference. With correlation there is always the possibility that some other variable, or set of variables, is the true causal factor. Here is a good example: in Alzheimer’s disease, patients build up deposits of amyloid beta and tau protein in their brains. There is a very clear association between the build-up of these proteins and Alzheimer’s disease. But scientists disagree about whether these plaques are the cause of the disease or the consequence of some more fundamental cause, such as vascular pathology or insulin resistance. Association does not equal causation.

In order to take epidemiological evidence as a strong indication of causality, the observed effects need to be very large. For example, having collected 50 years’ worth of data on smoking, Richard Doll et al. observed in 2004 that smoking multiplies your risk of developing lung cancer by about 14. With data like this, it’s unlikely that smoking is not causally linked to lung cancer in some way. Vanishingly few observational studies can present data with risk multiples anywhere near this large.

With that caveat in mind, there are at least three fundamental biases that observational studies need to mitigate: selection bias, data-gathering bias and confounding-variable bias.

Selection bias occurs when the population being studied fails to be representative of the general population (or a subset of the population that you are interested in). Are the study’s subjects for some reason more or less at risk than the average person would be?

Data-gathering bias happens when the data being recorded are inaccurate. Outcomes can normally be measured fairly precisely, but what about data regarding the risk factor?

Confounding-variable bias arises because, with epidemiological studies, it’s impossible to control for all variables. These are not lab-based experiments and you can’t hold other factors constant. Researchers therefore have to use statistical techniques to adjust for the effects of other risk factors that are not part of the hypothesis. How well are they able to do this?

The headlines

OK, so what does this study say, and should we believe it? Let’s look at effect size, selection bias, data-gathering and confounding variables in turn.

The headline findings: the researchers found that an additional serving of red meat per day was associated with a 12% increase in mortality risk, and an extra serving of processed red meat with a 20% increase. Looking at specific diseases: cardiovascular disease and cancer mortality increased by 16% and 10% respectively for a one-serving-per-day increase in red meat.

This was a huge study, which followed around 38,000 men in the Health Professionals Follow-up Study (HPFS) cohort and 84,000 women in the Nurses’ Health Study (NHS) cohort for 26 and 20 years respectively. So at face value, the data sound fairly significant and the population being studied a reasonable representation of the general population. However, even bearing my natural bias in mind, things start to unravel rapidly when you get into the details.

Effect size

Let’s look at effect size first. Increasing your mortality risk by 12% sounds significant. But is it really? Right off the bat, we know we are not dealing with a risk factor like smoking. Red meat increases your mortality rate by a factor of 1.12. Smoking increases your rate of lung cancer by a factor of 14. Smoking increases your risk of death by 83%; red meat by 12%. Ok, but even if red meat is not as deleterious as smoking, 12% is still something right?

In order to increase your risk of mortality by 12% you need to increase your red meat consumption by 1 serving per day. In the study, 1 serving is defined as 85g. So you’d need to eat ~600g of extra red meat per week in order to increase your mortality risk by 12%. That’s a lot of meat… Especially when you start looking at absolute effect size, rather than relative risk.

The mortality rate for men in the study was 1.18% per year; for women it was 0.68%. Apply the 12% relative increase and the men’s rate rises to roughly 1.32%, an increase of 0.14 percentage points. The women’s rate rises to roughly 0.76%, an increase of 0.08 percentage points. Suddenly these risks start to look almost negligible. Can you think of any other risk factors that might increase your annual mortality risk by less than a fifth of a percentage point? I’m sure there are quite a few.
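
If you want to sanity-check that arithmetic, here is a minimal sketch in Python (the baseline rates and the 12% figure are the ones quoted above; everything else is plain multiplication):

    # Convert a relative risk into an absolute change in annual mortality,
    # using the baseline rates quoted above (1.18% for men, 0.68% for women)
    # and the study's reported 12% relative increase per daily serving.

    def absolute_increase(baseline_pct: float, relative_risk: float) -> float:
        """Increase in annual mortality rate, in percentage points."""
        return baseline_pct * (relative_risk - 1)

    for label, baseline in [("men", 1.18), ("women", 0.68)]:
        delta = absolute_increase(baseline, 1.12)
        print(f"{label}: {baseline:.2f}% -> {baseline + delta:.2f}% per year "
              f"(+{delta:.2f} percentage points)")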

That being said, this is the least problematic aspect of the study. If I could really be convinced that red meat consumption was associated with a 12% increase in mortality risk, I would still take notice.

Data-gathering bias

The first major issue is data-gathering bias. The researchers collected data about participants’ red meat consumption using Food Frequency Questionnaires (FFQs). In these surveys, subjects estimate the frequency and quantity of their consumption of various food items. I know from personal experience just how large a gap there often is between what people eat and what they remember eating. That’s just anecdotal, of course, but if you’ve ever kept a food diary you’ve likely had the enlightening experience of realising just how much you were eating and drinking with no conscious awareness. What’s more, subjects only filled in updated FFQs at 2- or 4-year intervals. How accurately can you expect participants to report their average food intake over the preceding 2 or 4 years? I often ask members and clients what they’ve eaten that day and they’re hazy on the details.

The authors anticipate this objection and address it in the text. They write:

“the FFQs used in these studies were validated against multiple diet records”

Here they are referencing studies which investigate the correlation between FFQs and actual dietary intake in the populations being studied (the HPFS and NHS cohorts). Ironically, these reports do little to reinforce the case for FFQs; quite the opposite in fact. One study, which examines the HPFS population, finds that for processed meat items the correlation between intake reported in FFQs and actual intake is only 0.52. For red meat, the correlation is between 0.5 and 0.59. These are moderate correlations at best. This inaccuracy is particularly problematic in the case of processed meats. While participants, on average, reported eating 0.22-0.26 servings per day, actual dietary records showed an average of 0.53 servings. So the average subject under-reports their processed meat consumption by a factor of more than 2. And this discrepancy is found to be statistically significant.

When the input data are this imprecise, can we make any inferences at all, let alone inferences of effects as small as a factor of 1.12? The problem with taking such a broad category as a risk factor (red meat) is that the opportunity for error on the input side is commensurately magnified. While someone might be able to reliably report how many cigarettes they’ve been smoking, how can they be expected to remember how often they’ve been eating the huge variety of items that make up the category in question: everything from beef, pork and lamb to hot dogs, burgers, bacon and cold cuts?
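
To get a feel for how much damage noisy intake data can do, here is a toy simulation; a sketch under assumed parameters, not a reconstruction of the study. The intake distribution and noise level are invented, chosen only so that reported intake correlates with true intake at roughly 0.5, in line with the validation figures above. We bake in a genuine 12% relative risk per serving and then see how much of it survives when subjects are ranked by the noisy measure:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # True daily servings of red meat (illustrative distribution).
    true_intake = rng.gamma(shape=2.0, scale=0.75, size=n)

    # FFQ-style report: truth plus enough noise that the reported value
    # correlates with true intake at roughly r ~ 0.5.
    reported = true_intake + rng.normal(0, 1.7 * true_intake.std(), size=n)

    # Simulate deaths with a genuine 12% relative risk per extra serving,
    # on a baseline annual mortality rate of ~1%.
    death_prob = 0.01 * 1.12 ** true_intake
    died = rng.random(n) < death_prob

    # Compare death rates in the top vs bottom quintile of exposure,
    # first ranking subjects by the truth, then by the noisy report.
    def q5_over_q1(exposure):
        q1 = exposure <= np.quantile(exposure, 0.2)
        q5 = exposure >= np.quantile(exposure, 0.8)
        return died[q5].mean() / died[q1].mean()

    print("correlation, reported vs true:", np.corrcoef(reported, true_intake)[0, 1])
    print("Q5/Q1 mortality ratio, true intake:    ", q5_over_q1(true_intake))
    print("Q5/Q1 mortality ratio, reported intake:", q5_over_q1(reported))

Ranking by the noisy report drags a large chunk of the true effect into the noise; an effect as small as a factor of 1.12 has very little headroom for that kind of attenuation.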

Confounding variables

Potentially even more problematic than data-gathering bias is the presence of multiple confounding variables, and a clear relationship between these variables and red meat consumption. The paper’s authors split their cohorts into quintiles based on red meat consumption. If you look at the table describing the attributes of subjects in each of the 5 quintiles, you can see a striking relationship: as red meat consumption goes up, practically every other health risk factor worsens. In the study, people who eat more red meat also smoke more, drink more alcohol, eat more calories, do less exercise, have higher blood pressure and have a higher incidence of diabetes.

Given the general perception that red meat is bad for you, those who eat less of it tend to be more health conscious, and those who eat more, less so. We have a perfect example of the healthy user bias, which I discussed here. You could argue that the real causal variable is the subject’s degree of health consciousness. It is your level of health consciousness that determines your level of red meat consumption, which in turn is associated with mortality risk.
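
To make the healthy user argument concrete, here is a second toy simulation (again, every parameter is invented for illustration). In this model red meat has no effect on mortality at all; a latent health-consciousness score drives both meat consumption and death rates, and a crude quintile comparison still “finds” a risk:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    # Latent "health consciousness" score (hypothetical; never measured directly).
    health = rng.normal(0, 1, size=n)

    # More health-conscious people eat less red meat...
    servings = np.clip(1.5 - 0.5 * health + rng.normal(0, 0.5, size=n), 0, None)

    # ...and die less, for reasons unrelated to meat (smoking, exercise, etc.).
    # Note that servings do NOT appear in the mortality model at all.
    death_prob = 0.01 * np.exp(-0.3 * health)
    died = rng.random(n) < death_prob

    # Yet the crude data show mortality rising with meat consumption.
    top = servings >= np.quantile(servings, 0.8)
    bottom = servings <= np.quantile(servings, 0.2)
    print("mortality, top quintile of servings:   ", died[top].mean())
    print("mortality, bottom quintile of servings:", died[bottom].mean())

Statistical adjustment can remove this kind of spurious association only if the confounder is measured accurately and completely; an unmeasured, or noisily measured, confounder leaves residual confounding behind.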

But the authors claim they have controlled for all these confounding variables by including them in their regressions. Can this really be done? A quick dive into the numbers raises some doubts. Bear with me here, because this is the crux of the matter. If you compare the mortality risks generated by these statistical regressions with the raw unadjusted mortality rates across the 5 quintiles of red meat consumption, something strange appears. Check out the numbers below for the HPFS cohort and total red meat consumption.

                           Q1    Q2    Q3    Q4    Q5
Regression mortality risk  1.00  1.12  1.21  1.25  1.37
Raw mortality rate         1.13  1.06  1.11  1.18  1.41

As you’d expect, in the authors’ statistical regression, mortality risk increases in a neat linear fashion with increasing meat consumption across the quintiles (Q1-Q5). However, that relationship is not preserved in the raw data. As you can see, the raw mortality rate for the lowest quintile of red meat consumption is higher than for the second and third quintiles. It is only in the fourth and fifth quintiles that the mortality rates exceed the lowest quintile’s. Now, if there were conflicting confounding risk factors, some positively correlated with red meat consumption and others negatively correlated, you wouldn’t necessarily think much of this. But remember: every risk factor included in this study got worse as red meat consumption increased. Therefore, if there really were a relationship between red meat and mortality, you’d expect the raw data to reflect it, with mortality increasing monotonically with consumption. As you can see, it doesn’t.

This trend is even clearer when we consider unprocessed red meat, as you can see below. For unprocessed red meat, only the mortality rate of the fifth quintile of consumption exceeded that of the first. The mortality rates for quintiles 2-4 were actually lower than for quintile 1. That’s pretty staggering. The only conclusion I can draw is that there is no real association between red meat consumption and mortality risk. The effect seems to materialise out of convoluted statistical regressions. You only need to look at the raw data to see the reality.

                           Q1    Q2    Q3    Q4    Q5
Regression mortality risk  1.00  1.11  1.14  1.20  1.29
Raw mortality rate         1.23  1.15  0.99  1.21  1.30
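
Since the figures are right there in the two tables, the claim is easy to check mechanically. A quick sketch using the numbers quoted above:

    # Raw mortality rates per quintile, taken from the two tables above.
    raw_total       = [1.13, 1.06, 1.11, 1.18, 1.41]  # all red meat, HPFS
    raw_unprocessed = [1.23, 1.15, 0.99, 1.21, 1.30]  # unprocessed red meat

    def rises_monotonically(rates):
        """True only if mortality rises with every step up in consumption."""
        return all(a < b for a, b in zip(rates, rates[1:]))

    for label, rates in [("total", raw_total), ("unprocessed", raw_unprocessed)]:
        above_q1 = sum(r > rates[0] for r in rates[1:])
        print(f"{label}: monotonic rise = {rises_monotonically(rates)}, "
              f"quintiles above Q1 = {above_q1} of 4")

Neither series rises monotonically, and in the unprocessed case only one quintile out of four even exceeds the first.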

I don’t consider this paper evidence linking red meat with chronic disease and death. It is epidemiological in nature, which limits its power to imply causality from the outset. The absolute effects on mortality are small and associated with large changes in red meat consumption. More fundamentally, the FFQ is a seriously flawed method for assessing consumption, which calls into question any statistical inference drawn from it. The authors have not been able to adequately control for confounding variables. Red meat consumption is associated with a number of other risk factors, an effect which is possibly mediated by the fact that people generally believe red meat is unhealthy. The relationship deduced between red meat and mortality is therefore more likely the result of a healthy user bias. More health conscious people tend to live longer. And they also tend to eat less red meat.

Which all leaves one question: why do so many studies, and this one in particular, conclude that red meat is dangerous? I think the answer comes in two parts. The first is a prevailing and powerful bias: the scientific status quo holds that red meat is deleterious to health. As the authors of this paper state in their introduction:

“Substantial evidence from epidemiological studies shows that meat intake, particularly red meat, is associated with increased risks of diabetes, cardiovascular disease (CVD), and certain cancers”

If you think the current evidence is “substantial”, you are unlikely to be conducting research with a fully open mind. The second part is the investment: add to this bias the money and time spent on an enormous study like this, and you have a recipe for unrigorous science. How likely are you to accept a null result when you thoroughly believe your hypothesis and have spent 20+ years and many millions of dollars trying to prove it? I’m not alleging any form of malpractice or malice on the part of these researchers. I’m sure they are driven as much by a desire to help people as I am. They genuinely believe that red meat is dangerous and that they are aiding a public health effort to save lives. But these pressures and biases result in unconvincing data.

What do you think?
