Mimi Grishman (44:41)
So how do we think about type 2 diabetes from a functional medicine perspective? What's the root cause? Functional medicine is all about root cause. The root cause is something called insulin resistance. And this comes from eating a diet that's high in sugar, refined flour, grains, and ultra-processed food; there's no doubt about this. It also comes from lack of exercise, being sedentary, not moving enough, or being under-muscled, right? Muscle is your metabolic Spanx, according to my friend JJ Virgin. And how do you address that? Well, you eliminate ultra-processed food, processed grains, refined grains and starches, sweets, and especially sugar-sweetened beverages. That improves your blood sugar balance and your insulin sensitivity. And what should you be eating? Good-quality protein, and it can be meat. That's my view of the literature, not my opinion; it's pretty well evidenced by the randomized controlled trials. Fiber, right? Fruits, vegetables, nuts, seeds, sometimes whole grains if you're not a full-blown diabetic, and healthy fats like olive oil, avocado oil, and macadamia oil. None of these will affect your blood sugar. Then you want to use testing to check your fasting glucose, your fasting insulin, your A1C, your triglycerides, and other markers to understand whether you're insulin resistant. Now, I co-founded a company called Function Health. You can go to functionhealth.com. We've created an initial test of over 110 biomarkers. It's $499 a year for membership and includes testing twice a year. You get all the metabolic markers you need. You get insulin, which your doctor almost never tests, plus A1C and blood sugar. But you also look at lipid particle size, what we call lipoprotein fractionation. Not just your regular cholesterol profile, but whether you have small, dense particles, or large or small triglycerides or HDL. All of these will tell you about your cardiometabolic health.
We also measure inflammatory markers like C-reactive protein and others, so you get a really good understanding of where you're at. So go check it out at functionhealth.com; you can use the code young forever if you want to jump the wait list. It's really a way to get testing to see what's going on with you and what's going on with your diet. So again: test, don't guess. Now, it's no secret that navigating the realm of nutrition has become a challenge for the general public, and even for people like me and healthcare professionals who've been studying this for 30 years. One week eggs are good for us, only to be vilified the next for allegedly raising cholesterol levels. The narrative on dietary fats is no less tumultuous, and I wrote a whole book on this called Eat Fat, Get Thin. Some experts say that fat is the chief culprit behind heart disease; others say it's critical for overall health and well-being. Well, more recently, a study made headlines linking red meat consumption to an increased risk of type 2 diabetes, leaving the public once again confused, and understandably so. That's why in today's Health Bites episode, we're diving deep into the findings from this paper and unpacking the study's design flaws, its inaccuracies, and where the researchers got it straight-up wrong. The study was entitled Red Meat Intake and Risk of Type 2 Diabetes in a Prospective Cohort Study of United States Females and Males, published in October of 2023. It's important to understand study design, because you have to understand science before you can interpret science. You have to understand the types of studies that are done, which can show cause and effect, and which can show only correlation, not causation. For example, every day I wake up and the sun comes up. It's 100% correlated, but it's 0% causal. If I die tomorrow, the sun's going to keep coming up.
If I slept through the middle of the day, the sun's going to keep coming up. They have nothing to do with each other. And essentially that's what these observational studies, like this particular study, did. They looked at correlation, not causation. That means we can't prove cause and effect. So when you hear the headline that red meat is linked to causing type 2 diabetes, it's BS, okay? We have to look at what the data show and what they don't. And these studies are not wrong; they're not bad to do. They're done to help us understand what might be a useful avenue for further research. They're not the end of the research; they're useful for generating hypotheses. For example, in the study of smoking and lung cancer, they did observational studies, right? They weren't going to do a randomized controlled trial, because you can't put half the people on cigarettes and keep half the people off cigarettes. Basically, they found a 10- to 20-fold increase in the risk of lung cancer in smokers. To put that in perspective, that's roughly a one thousand to two thousand percent increase in your risk of having a particular disease. And that ended up being correct, because it was such a strong correlation. Whereas in this red meat diabetes study, to cut to the punch, it was about a 20% increase, which is essentially meaningless. Anything less than, let's say, a 200% increase in a correlation study, you pretty much want to ignore. And Dr. Ioannidis from Stanford, an incredible scientist, has written a lot about this; he's dissected the value of different types of studies, what we can learn from them, and what we can't. So we have to start out really understanding that this study, by its very nature, was not designed to prove cause and effect, and all scientists would agree on that. It's just the nature of science. Okay, so let's get into the study. This is what we call a prospective cohort study.
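To make that arithmetic concrete, here's a minimal sketch converting a relative risk (a fold change) into the percent-increase figures quoted above. The function name is mine, for illustration; the 10x, 20x, and 1.2x figures come from the episode.

```python
def pct_increase(relative_risk):
    """Convert a relative risk (fold change) to a percent increase in risk."""
    return (relative_risk - 1) * 100

# Smoking and lung cancer: a 10- to 20-fold relative risk,
# i.e. roughly a thousand to two thousand percent increase.
print(pct_increase(10))   # 900.0
print(pct_increase(20))   # 1900.0

# The red meat study, by contrast: about a 1.2-fold relative risk,
# i.e. roughly a 20% increase.
print(pct_increase(1.2))
```

This is why a 20-fold association can survive confounding while a 1.2-fold association usually cannot: the noise floor of observational studies is far above 20%.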
And it's an observational study, a population study, an epidemiological study; these all mean the same thing. Essentially, it studies a group of individuals over time to look at the association between certain exposures, behaviors, diets, and risk factors and specific outcomes. So basically, they tracked thousands of people over many, many years, looked at what they ate, and saw if there was a correlation with diabetes, and lo and behold, they found one. But let's talk about why this may not actually be as clear as the study seems to suggest. Now, in this type of study, people are identified based on their exposure status and then followed over time to observe and record outcomes. In other words: what did people eat over many decades, what was that diet, and was it correlated with any bad outcomes later in life? So you follow people for 30 years, you have them track their diet records, which we'll talk about in a minute, and then you see whether a particular food or type of food seems to correlate, not cause, correlate with some bad outcome like diabetes. That's what they did. Basically, the goal is just to assess relationships between various insults, exposures, toxins, smoking, diet, whatever, and outcomes. It essentially looks for things that may be worth studying further with a randomized controlled, double-blind trial. That was not done here. Now, these studies can be helpful, and researchers say, well, we're going to control for what we call confounding variables, meaning things that can throw the study off; we'll talk about this. For example, there was a study done many years ago by the NIH and the AARP, the American Association of Retired Persons, that looked at meat eating and chronic disease and death and cancer and so forth. They found a big correlation.
But that study also showed that the people who ate meat didn't care about their health: they smoked more, drank more, ate more calories, about 800 more a day, were more overweight, didn't eat fruits and vegetables, didn't exercise, drank more alcohol, and didn't take their vitamins. Of course they had more disease. It wasn't because of the meat; it was a problem created by these confounding variables, and we'll talk more about that. Now, this study was published in the American Journal of Clinical Nutrition by folks at Harvard who are great scientists, but they're focused on epidemiology, particularly at the School of Public Health, which is where the study came out of. And unfortunately, people have bias, and the study authors are very biased toward a plant-based diet. So right off the bat, you look at it and think: all right, they already have a bias, and that affects the study and its outcome. Basically, the objective of this study was to assess the link between total, processed, and unprocessed red meat intake and type 2 diabetes, and then to estimate the effect of substituting different protein sources, like vegetable proteins, nuts, seeds, beans, and grains, for red meat on type 2 diabetes risk. But again, this is just a hypothesis-generating study. Now, again, this was a population study. It was based on the Nurses' Health Study, the first and the second, which together had about 216,000 participants, and the Health Professionals Follow-Up Study, which included men. The first study started in 1976 with female nurses, then another in 1989 with female nurses, and the Health Professionals study started in 1986. They followed people for a long period of time; they calculate the number of years and people and come up with a figure of 5.4 million person-years. So that's pretty good. And what they did was really interesting.
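As a rough sanity check on that figure: person-years are just the sum of each participant's follow-up time. With the cohort size quoted above and an average follow-up of 25 years (an assumed number chosen for illustration, not taken from the study), the arithmetic reproduces it:

```python
# Person-years = participants x average years of follow-up per person.
participants = 216_000     # combined cohort size quoted in the episode
avg_followup_years = 25    # illustrative assumption, not from the study
person_years = participants * avg_followup_years
print(f"{person_years:,}")  # 5,400,000
```

The point of the metric is that 5.4 million person-years can come from many people followed briefly or fewer people followed for decades; it says nothing by itself about data quality.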
They used something called a food frequency questionnaire, which assesses people's diet every two to four years from baseline. Now, can you remember what you had last Thursday for lunch? Do you remember how much of this or that you had over the last week? Probably not, right? These are flawed tools, and there's a lot of research and science about how flawed and imperfect they are and how often they are very misleading. We see that in this study. So, the study findings; just to be clear, this is association, correlation, not causation. They found that between the lowest and the highest red meat intake, the risk of diabetes went up by 62%. Right, not 200%, 62%. Processed meat was associated with a 51% higher risk, and unprocessed red meat with about a 40% higher risk. If you substituted one serving of nuts or beans for total red meat, your risk was 30% lower; for processed red meat, 41% lower; and for unprocessed red meat, about 29% lower. They also found that substituting one serving of dairy for total, processed, or unprocessed red meat gave you a lower risk of type 2 diabetes. Now, here's what's really important: this study kind of misses the point. What is the mechanism here? They try to explain some mechanisms, but it's pretty weak. We know that sugar and refined carbohydrates are the primary cause of type 2 diabetes, not red meat. And ancestrally, we've been eating meat for as long as we've been human. I just came back from visiting the Maasai in Africa, as I've mentioned on other podcasts, and these people ate the blood, the milk, and the meat of their cows. That was their main diet. They were healthy, they were super thin, they were very fit, and they had no diabetes. But when I recently visited their community, the Coca-Cola truck drives up every day. They get processed cookies from the local town, made by the industrial food system, and now they're gaining weight.
And type 2 diabetes is rampant in this Maasai community in Africa. It's just heartbreaking to see that within minutes this entire Coca-Cola truck, a big truck, was emptied out by the local population, not knowing what they were doing to themselves. They didn't even know it was connected. So basically, this study fueled a lot of clickbait headlines. For example, WebMD said just two servings of red meat per week raises diabetes risk. Well, it doesn't. It shows that it's correlated, not causal. Eating red meat more than once a week is linked to type 2 diabetes risk; that was CBS. This is just bad reporting and bad journalism. And social media was all over the place, right? Some people were pro red meat, some people anti red meat. People are super confused, nobody knows who to believe, everybody's distrusting public health and dietary guidelines, and it's just a mess. So I'm going to try to unpack it for you, so you really understand how to think about this and how to know what to believe around this whole issue of red meat and diabetes. Basically, the problem with this study, as we mentioned, is that it's an observational study, and we just cannot draw causal conclusions from an observational study. It doesn't prove causality, and we also have to look at the limitations of the study. There are a lot of them. The study authors, for example, as I mentioned, are very biased toward a plant-based diet and veganism. How they picked the participants of the study may not be an issue; industry funding, which we want to look at, probably was an issue here. But there's also this thing called recall bias, which is common with food frequency questionnaires. People are more likely to report healthy food than unhealthy food, and desserts, sugar-sweetened beverages, and alcohol are underreported. This is published; we're going to put all the references for everything I'm saying in the show notes.
So have a look at those. Everything I'm saying is documented and well researched, and you can dive in, but it would take me about 10 hours to cover every study in detail. Basically, I've found this in my practice: people overestimate how much they exercise and underestimate how much they eat. It's pretty difficult; humans are pretty flawed. Now, a 2012 study, Red Meat Consumption and Mortality, looked at prospective cohorts and found that the people who ate the most red meat, the highest 20%, had a 45% higher risk of dying from heart disease compared to those who ate the least red meat, the lowest 20%. However, when they looked more closely at the people in these extreme groups, they noticed that besides eating red meat, they had other habits that made them more likely to have heart disease: they didn't exercise, they ate too much, they smoked, their cholesterol was worse, or maybe their fish consumption affected their health and risks. For example, maybe the people in the lowest-risk group didn't eat meat, but they also exercised, didn't smoke, and ate healthier food. So you can't quite tell what's going on. The study supports the idea that eating a lot of red meat is linked to a higher risk of heart disease, but people who choose to eat more or less red meat have other lifestyle factors that influence their health. Now, there are other factors, these confounding variables I mentioned. Researchers try to control for them, but it's really tough, and they can only control for what they think to control for. That makes it really hard to determine true cause and effect. Like I mentioned with the AARP study: they smoked more, they drank more, they ate fewer fruits and vegetables, they didn't exercise, and all these other issues. That's why they had more disease, not because of the meat.
Other issues with the study could be design flaws, and maybe the study population is different from the general population, so it may not be widely generalizable. They also do all these weird statistical calibrations to normalize the data, which they did in this study, and we're going to talk about what that means. I think it was a scientist named Roger Williams who said there are liars, damn liars, and statisticians, or maybe that was Mark Twain, I don't know. But I think it's true: you can manipulate the data to make it show what you want, and that's clearly been done here. The other thing this study does is claim support for dietary guidelines to limit red meat consumption. Why does it say that? Well, the study basically said: our study supports current dietary recommendations for limiting the consumption of red meat and emphasizes the importance of alternative sources of protein for type 2 diabetes prevention. But dietary guidelines, just like this study, are heavily based on observational data, data that can't prove cause and effect, and systematic reviews and meta-analyses of observational data are the weakest types of studies. There are confounders, there's bias, there are a lot of problems with these studies. And often the researchers have ties to industry and the expert panels are not independent; it's kind of a mess. So how do we know what to do in science? Well, randomized controlled trials are the gold standard for drawing causal inferences between exposures and outcomes. For example, you give people with high blood pressure either a placebo or a blood pressure drug, follow them for three months, and see: did the people taking the placebo lower their blood pressure, or the people on the pill? That's a randomized controlled trial.
And you randomize people so you're not stacking the deck in favor of a healthier or sicker population. Now, these trials are hard to do in nutrition, because you need to control everything. It's great with a lab rat, but it's not easy in humans, because they're what we call free-living and they do whatever they want. You can say, I want you to eat a low-fat diet, or a low-carb diet, or exercise 150 minutes a week, or not smoke, or sleep eight hours a night, but they're probably not going to do it. You'd basically have to put people in a locked metabolic ward for years, give them all their food, and track everything they do to actually know what's going on, like a lab rat. But we really can't do that. We can't take 10,000 people and feed them a vegan diet, and 10,000 people and feed them an omnivore diet including red meat and healthy foods, follow them for 30 years, and give them all the food and track everything. It would be billions and billions of dollars and impossible to do. It's not practical, it's not ethical, it's expensive, and it's hard to recruit volunteers for. So we have to do the best with the data we have, which are systematic reviews and meta-analyses of randomized controlled trials, mechanistic studies, and lab studies. There are many different levels of evidence, so you have to look at the totality of the evidence. So now let's dive into this problem of study design, what was wrong with this paper, and why it does not prove that red meat causes type 2 diabetes. As I mentioned before, they gave participants a food frequency questionnaire, and these are highly inaccurate. Every two to three years, people get asked what they eat.
They got a questionnaire: what's your average intake of food and beverages over the last 12 months? Do you know what you ate over the last 12 months? I wouldn't have a clue. How often do you remember eating X or Y food? Do you eat chicken with the skin on or without? Do you eat hamburgers, hot dogs, processed meats? They ask all these questions. They also, kind of weirdly, track things like beef, pork, and lamb as a sandwich or mixed dish, but no serving sizes were noted. Sandwiches and lasagna also have bread and pasta and processed carbs, so is that part of it? We don't know. So they basically just looked at what people said they were eating. And by the way, I could go way deeper into these food frequency questionnaires, but just trust me, based on the data, and we'll put the links in the show notes, they're highly inaccurate. They've been shown not to be a good tool for measuring nutritional intake over time, and they don't really serve as a valid metric for tracking outcomes. So right off the bat, it's a tough study to do. The second issue, which I mentioned already, is that the red meat definition included sandwiches and lasagna, which were basically counted twice, as both processed and unprocessed red meat. Processed red meat is hot dogs, bacon, meat sandwiches, sausage. Unprocessed red meat is things like hamburgers, or beef, pork, or lamb in a sandwich. So it's kind of weird; they included other foods in the meat categories, and you have to be clear about that. The third issue is that the serving sizes changed over time. Why? Because the food frequency questionnaires were different in different parts of the study. One was from 1980 and one from 1984, with 61 items and 120 items respectively, and they changed the definitions of what a serving is between these questionnaires. So it's super confusing. The nurses in the study were asked how often they consumed two slices of bacon.
Now, the serving size of bacon is one slice, but before it was two slices. How did they adjust for this? One serving of processed red meat is considered 45 grams. How did they measure that? Did people weigh their lunch meat? Did they take their bologna or salami and put it on a scale? I doubt it. What about chicken, beef, pork, or lamb? They said six to eight ounces was a serving; today one serving is three ounces. Did participants know this? Did the researchers translate a three-ounce serving to a six-to-eight-ounce equivalent? Probably not. And that creates more error in the studies. Issue four in the study, and this is really crazy, is that they used statistics to massage the data toward the outcome they wanted. They call this process calibration: calibrating the results using a seven-day weighted diet record and food frequency questionnaires from two other population studies. In other words, they kind of acknowledge that food frequency questionnaires are not that accurate, so they use other ones to correlate against and create this mishmash of data that shows what they want. And what they found is kind of crazy: the calibration roughly doubled the effect for total red meat, processed red meat, and unprocessed red meat. Before the calibration, for example, a one-serving increment of total red meat was associated with a 28% higher risk of diabetes; after the calibration, it was 47%. Before the calibration, a one-serving increment of processed red meat was associated with a 50% higher risk of diabetes; after, it was 101%. So it's like, what are you doing here? And guess what number was reported in the headlines? Not the uncalibrated but the calibrated number: too much red meat is linked to a 50% increase in type 2 diabetes. Even NPR didn't really do a good review of the study; they didn't do the investigative journalism that I think is sorely lacking, and they reported that 50% increase for red meat.
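To see just how much the calibration inflated the associations, here's a small sketch comparing the before-and-after effect sizes quoted above (the dictionary layout is mine; the percentages are the episode's numbers):

```python
# Percent increase in diabetes risk per one-serving increment,
# before vs. after the statistical calibration step.
effects = {
    "total red meat": (28, 47),
    "processed red meat": (50, 101),
}
for food, (before, after) in effects.items():
    print(f"{food}: {before}% -> {after}% ({after / before:.2f}x inflation)")
```

Running this shows roughly 1.7x inflation for total red meat and 2x for processed, which is the "calibration doubled the effect" point.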
So like I said, before the calibration it was 28%; after, it was 47%. The next issue is that the authors compared the lowest intake of red meat to the highest intake, but historically they have reported the risk per serving, which is a more quantitative metric. To explain what that means: in a 2011 paper called Red Meat Consumption and Risk of Type 2 Diabetes: Three Cohorts of US Adults and an Updated Meta-Analysis, they reported a 12% higher risk of diabetes per serving of unprocessed red meat, 32% for processed meat, and 14% for total red meat. But this paper compared the highest to the lowest intakes, claiming a 51% increased risk for unprocessed, a 101% increased risk for processed, and 40% for total. This method, comparing extremes rather than reporting per-serving risk, generated much more headline-worthy statistics. In other words, the way they reported this makes it more sensational and looks better for the agenda of having a study show that red meat causes diabetes. Another thing: the women in this study, the Nurses' Health Study, ate more red meat than the men in the Health Professionals study. This is the first study ever to claim this; typically every other study has shown the opposite. What does that mean? Well, I don't know, but it seems to be a clue that maybe the study is a little wacky and doesn't comport with all the other data we have on meat consumption by sex. The next issue is that total red meat intake had a higher risk of diabetes than both processed and unprocessed red meat. That doesn't make sense. How could total red meat be worse than the individual types of red meat, when the total is the sum of both of them? You don't get one plus one equals three. It doesn't make sense.
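The one-plus-one-equals-three problem can be stated precisely: if total red meat is a mix of processed and unprocessed, its relative risk should be a weighted average of the two, so it must fall between them. A sketch with hypothetical relative risks and an assumed 50/50 split (neither number is the study's exact estimate):

```python
def blended_rr(rr_processed, rr_unprocessed, share_processed):
    """Relative risk of a category that mixes two sub-categories,
    weighted by the share of servings coming from each."""
    return share_processed * rr_processed + (1 - share_processed) * rr_unprocessed

# Hypothetical illustration: processed riskier than unprocessed.
total = blended_rr(rr_processed=1.51, rr_unprocessed=1.40, share_processed=0.5)
print(total)  # falls between 1.40 and 1.51, never above both
```

No choice of weights between 0 and 1 can push the blended value above both components, which is why a total-red-meat estimate exceeding both sub-categories is a red flag for the analysis.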
Most studies looking at the risk associated with red meat show that processed meat is riskier than unprocessed red meat, and total red meat, the sum, falls in between. If processed red meat carries a higher risk and unprocessed red meat a lower risk, the combined risk is going to land somewhere in the middle, a kind of weighted average. But in this study they found the opposite, which doesn't make any sense. If processed red meat gives you a higher risk of diabetes and unprocessed red meat a lower risk, then when you add them together, you shouldn't get a higher risk than either one alone. The next issue with the study is what we call healthy user bias, and I think this is really, really important. Essentially it's about what I mentioned earlier, the idea of confounders: why were the people in the study getting more diabetes or not? Was it because of the meat they were eating, or a bunch of other habits? The people in this study, when you look at their characteristics, had a much higher body mass index; in other words, they were heavier, they were less physically active, they were more likely to be smokers, and they were less likely to take vitamins. So of course they're going to have more risk. The healthier people didn't eat red meat. Why? Because they thought red meat is bad. That's the propaganda we have in our society: red meat causes heart disease, red meat causes cancer, so we should be eating less meat. And in fact, we are, which is another really important point. When you look at the amount of red meat we're eating, it has dramatically decreased over the last 30 to 40 years, because the message in the public health domain has been to eat less meat. But at the same time, what's happened? The risk of diabetes has skyrocketed.