When officials at the USDA came up with the recommended dietary allowances (RDAs), they primarily studied populations and individuals who consumed high-carb, grain-based diets. In a 2007 Institute of Medicine review of the RDAs, several speakers asserted that the Dietary Reference Intakes should be based on a higher standard of evidence than what had been used to formulate the recommendations. Basically, the RDAs are more or less a guess, and they certainly weren’t formulated by evaluating people who were eating low-carb or (heaven forbid!) meat-only diets. Consequently, we have no real idea of what the optimal or even sufficient levels of vitamins and minerals are for various subsets of dieters. For now, the entire dietary profession uses this low-quality evidence as the basis for almost all the current recommendations.
We’ve seen evidence of other differences in requirements for some vitamins, minerals, and cofactors. A deficiency of thiamine, for example, leads to a condition called beriberi, which results in severe neurological and cardiac disease. Researchers have found that an animal’s thiamine requirement varies with its carbohydrate consumption. This result was observed as far back as the late 1800s, when scientists noted that animals fed a low-carbohydrate diet remained free of disease despite low thiamine levels, whereas animals fed a high-carbohydrate diet developed disease at those same low thiamine levels.
Magnesium is a mineral that’s crucial for many human physiologic functions. Recently, magnesium deficiency has been implicated as a potential source of numerous disease states. Interestingly, magnesium is a cofactor that is crucially involved in carbohydrate metabolism, and there is some research showing a relationship between blood glucose and magnesium levels. Is it possible that many people are identified as having a magnesium deficiency because high rates of carbohydrate ingestion increase their demand for the mineral? It’s certainly an interesting question, and that relationship would account for the lack of any clinically relevant nutrient deficiencies in our observations of the modern-day carnivore-dieter population.
Unfortunately, it’s challenging to make assessments about vitamin or mineral deficiencies. We can look for overt clinical symptoms and more subtle subclinical signs, such as poor energy, sleep, or mood. Aside from those symptoms, we’re often limited to studying the things we can measure most easily, which generally comes down to a blood test.
For all the billions of dollars we spend annually on blood tests, the sad fact is that many are poor predictors of chronic issues. Sure, sometimes we can get important information from a blood test, but to think that a blood serum vitamin C level can tell us something specific, such as the cellular concentration of vitamin C in our left tibia, is misguided. Perhaps at steady state, when no environmental or internal changes are occurring, a certain level can be expected to exist, but the truth is that trafficking of materials in the blood can vary wildly.
Do sleep, exercise, recent meals, temperature, time of year, injury, or illness (not to mention thousands of other things) affect those concentrations? Almost certainly, the answer is yes. Another way to identify problems is to biopsy the tissues, which gives a far better representation of one’s nutritional state. The problem is that biopsies are often fairly painful, they carry far more risk, and they’re expensive. Thus, we continue to rely on unreliable guesswork to make many of our decisions about how to address health issues.
One of the recurring themes that I like to talk about is that, despite what many people like to proclaim, the science of nutrition is not settled. (Stating that science is settled would completely undermine the basic concept of science.) Take this theory, for example: Red meat causes diabetes. The evidence in support of this theory would be based on population survey data that shows that people who eat more red meat have higher rates of diabetes. There’s nothing wrong with that theory as long as the data continues to support that claim.
However, what if you have information to the contrary—such as numerous accounts of people who eat only red meat and notice that their diabetes resolves? At this point, you have to adjust your hypothesis and modify your theory. You could say that maybe it was some other factor common to those meat eaters with diabetes that caused the disease; in other words, maybe meat combined with something else is to blame.
Unfortunately, we live in a time when entire industries and careers are built upon a particular hypothesis, and even in the face of new or overwhelming evidence, some people are unwilling to revisit or revise their original assumptions. This is human nature and to be expected. The unfortunate part is that those assumptions can affect many lives around the world, and many billions of dollars are tied up in them.
Here’s a general question to ponder before we go on: Why don’t the wild animals that eat meat as part of their diets suffer from the chronic diseases that modern humans do? How can a food source that is ubiquitous throughout the animal kingdom, and that humans have clearly eaten for millions of years, now suddenly be toxic only to humans while every other animal is just fine?