Hemoglobin Levels in Runners: When Low Is Normal vs. Performance-Limiting
Introduction
Hemoglobin levels in runners are often lower than expected, and this frequently causes confusion when reviewing routine blood tests. In many endurance athlete patients, hemoglobin levels are naturally lower than in the general population due to training-related adaptations. As a result, some athletes are mistakenly classified as anemic despite having normal iron stores and no true impairment in oxygen-carrying capacity.
Research makes clear that low hemoglobin in a runner carries two completely different meanings depending on the cause. One is a physiological adaptation that can actually enhance performance. The other is a genuine nutritional deficit that quietly erodes your endurance, recovery, and race results. Knowing which one you’re dealing with is one of the most practically important pieces of information a runner can have.
Interpreting hemoglobin levels in athletes is often challenging for clinicians and requires particular care. I wrote this article to help clarify the distinction between physiological adaptation and true deficiency, because the same hemoglobin value can reflect either, depending on context.
The Prevalence Problem: Why Hemoglobin Levels in Runners Confuse Clinicians
Iron deficiency is common in female endurance athletes and may affect up to 60% in some cohorts, while approximately 19.7% of competitive athletes across all sports show iron deficiency sufficient to impact performance [1][2]. At the same time, a substantial number of runners in these populations show hemoglobin readings below standard reference values for entirely benign reasons. The challenge is that a single number on a blood test cannot distinguish between these very different scenarios.
Standard hemoglobin reference ranges were not built with endurance athletes in mind; they reflect resting, sedentary populations and are typically defined statistically so that 95% of the general population falls within them. When those thresholds are applied to runners, they routinely flag healthy, well-adapted athletes as anemic—generating unnecessary concern, unnecessary supplementation, and sometimes unnecessary diagnostic workups.
Sports Pseudoanemia: The Adaptation That Looks Like a Problem
In endurance physiology, hemoglobin levels in runners often decrease as a result of plasma volume expansion rather than true anemia. When you train consistently for running, your cardiovascular system undergoes a series of adaptations designed to improve oxygen delivery and thermoregulatory efficiency. One of the most significant—and most misunderstood—is a deliberate expansion of plasma volume.
Endurance exercise training expands plasma volume by 9–25%, adding approximately 300–700 mL of additional fluid to the bloodstream [3]. This happens through a well-characterized hormonal cascade involving aldosterone, vasopressin, and albumin production, and it begins within hours of sustained exercise. The mechanism is not pathological—it is adaptive. Greater plasma volume improves cardiac output, enhances blood flow to working muscles, and increases the body’s capacity to dissipate heat during prolonged exercise. From a performance perspective, oxygen-carrying capacity is determined more by total hemoglobin mass than by hemoglobin concentration alone.
However, this expansion creates an arithmetic problem on a standard blood test. Hemoglobin is measured as a concentration—grams per deciliter of blood. When plasma volume increases substantially but red blood cell mass increases more slowly, the concentration of hemoglobin appears to fall even when the absolute amount of hemoglobin in the body has not decreased. The result is a dilutional effect that makes a well-adapted runner appear anemic when they are not [4].
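The dilution effect is easy to see with a back-of-envelope calculation. The sketch below uses hypothetical, illustrative numbers (a fixed total hemoglobin mass and an assumed 500 mL plasma expansion), not clinical data:

```python
def hb_concentration(hb_mass_g: float, blood_volume_l: float) -> float:
    """Hemoglobin concentration in g/dL from total mass (g) and blood volume (L)."""
    return hb_mass_g / (blood_volume_l * 10)  # 1 L = 10 dL

hb_mass = 750.0   # total hemoglobin mass in grams, held constant
pre_volume = 5.0  # blood volume (L) before training adaptation
post_volume = 5.5 # after ~500 mL of plasma volume expansion

pre = hb_concentration(hb_mass, pre_volume)    # 15.0 g/dL
post = hb_concentration(hb_mass, post_volume)  # ~13.6 g/dL
```

The same oxygen-carrying mass of hemoglobin now reads almost 1.5 g/dL lower on a standard panel, which is exactly the pattern seen in sports pseudoanemia.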
This phenomenon is well established in the literature. In studies comparing trained endurance athletes to sedentary controls, 11.7% of male distance runners and 11.7% of female distance runners recorded hemoglobin values below standard reference ranges—yet iron deficiency, as defined by depleted iron stores, was present in only 3.3% and 5% of those groups respectively [4]. In many trained runners with normal iron indices, these low hemoglobin concentrations are best explained by expanded plasma volume rather than true anemia; the expanded plasma volume is itself a marker of superior cardiovascular conditioning.
Before treating anemia in an athlete, I always make sure to assess iron status comprehensively and distinguish true iron deficiency anemia from sports pseudoanemia. In my experience, this distinction often becomes clear when iron parameters remain within normal ranges despite a low hemoglobin concentration. Relying on hemoglobin alone can easily lead to unnecessary or inappropriate treatment. For a practical guide on how I approach this in clinical practice, see my article on iron panel interpretation for athletes.
True Iron Deficiency: When Low Hemoglobin Actually Matters
Not every low hemoglobin in a runner reflects a healthy adaptation. A meaningful proportion represents genuine iron depletion, with real consequences for performance and health. The distinction is clinically important because the two conditions call for opposite management responses.
Iron is the rate-limiting nutrient for hemoglobin synthesis, and runners face multiple simultaneous mechanisms that deplete iron stores faster than dietary intake can replace them.
Foot-strike hemolysis is a mechanical process unique to running. Every foot impact transmits forces through the plantar capillaries, physically lysing red blood cells that pass through them. Post-race studies and systematic reviews show reductions in haptoglobin with accompanying rises in reticulocyte count, consistent with transient foot-strike hemolysis [5]. Footwear and surface characteristics may influence the degree of foot-strike hemolysis, though they do not remove the phenomenon entirely.
Exercise-induced hepcidin elevation is arguably the more insidious mechanism. Hepcidin is the liver’s master regulator of iron absorption, and it is triggered by the inflammatory cytokine interleukin-6 (IL-6) released during exercise. A prolonged bout of running produces a significant increase in hepcidin that persists for several hours post-exercise, during which it actively suppresses iron absorption from the gut [6]. Research with trained collegiate cross-country runners confirmed that a single prolonged run measurably decreases dietary iron absorption through this mechanism [6]. Iron absorption can be reduced for several hours after prolonged running, so iron-rich meals or supplements may be better timed away from hard sessions rather than immediately after exercise [6].
Gastrointestinal microbleeding compounds the problem. Sustained running redistributes blood away from the gut, and the resulting ischemia can compromise intestinal mucosal integrity, producing small but chronic blood losses. This is particularly relevant during long races and high-mileage training phases.
Menstrual losses add a further iron burden in female runners. The combination of menstrual iron loss, elevated hepcidin, foot-strike hemolysis, and often-inadequate dietary iron intake places female athletes at substantially higher risk—a 2024 study of 1,190 competitive athletes found that female sex independently increased the odds of iron deficiency by a factor of 4.35 [2]. In female athletes with a history of heavy menstrual bleeding, I have a low threshold for referring them for gynecological evaluation, as addressing the underlying cause is often an essential part of managing recurrent iron deficiency.
In my clinical practice, I frequently encounter iron deficiency in athletes, particularly among female athletes, who are at higher risk. Heavy menstrual bleeding is a significant contributing factor. In addition, dietary patterns such as vegetarian or energy-restricted diets can increase the risk, especially in sports where maintaining a lower body weight or specific physique is emphasized.
What often makes this more challenging is that the symptom profile can overlap with general fatigue. Many athletes are highly disciplined individuals who balance intense training with work or studies, and the resulting cumulative load can be substantial. In this context, it is not always straightforward to distinguish iron deficiency from broader fatigue, overreaching, or even overtraining syndrome without careful evaluation.
The Performance Impact of True Iron Deficiency
Understanding the performance cost of true iron deficiency helps clarify why the distinction from pseudoanemia matters so much.
Iron deficiency in athletes, even before hemoglobin falls to clinically anemic levels, reduces maximal aerobic capacity. A 2024 study of 1,190 athletes found that iron-deficient athletes had measurably lower VO2 peak values (43.4 vs 45.6 mL/min/kg) and were significantly less likely to achieve a VO2 peak above 50 mL/min/kg compared to iron-sufficient athletes [2]. These are not trivial differences in a performance context—they represent the gap between competitive and non-competitive.
A systematic review of 23 studies encompassing 669 female athletes found that iron deficiency reduced endurance performance by 3–4% [1]. In endurance sport, where races are decided by margins of seconds, an impairment of this magnitude can decide the result. The same review found that endurance performance improved by 2–20% when iron-deficient athletes were treated with appropriate supplementation for up to 56 days [1].
The mechanism for performance impairment goes beyond hemoglobin. Iron is a cofactor for cytochrome enzymes in the mitochondrial electron transport chain, for myoglobin in muscle oxygen storage, and for multiple enzymatic processes in energy metabolism. Iron deficiency without clinical anemia can still impair oxidative capacity at the cellular level.
Beyond a decline or plateau in performance, my patients with iron deficiency often report pronounced fatigue, slower recovery, and a reduced tolerance for training load. These symptoms can develop even before anemia is present, reflecting iron’s essential role in cellular energy production and muscle function.
Assessment: Reading the Markers That Actually Matter
A standard complete blood count that returns a hemoglobin of, say, 12.5 g/dL in a female runner tells you almost nothing useful in isolation. To distinguish sports pseudoanemia from true iron deficiency, a more complete picture is needed.
Serum ferritin is the most important marker. As the primary indicator of stored iron, it falls well before hemoglobin drops—making it the earliest detectable signal of iron depletion. Standard laboratory reference ranges define iron deficiency at ferritin below 12–15 µg/L. These thresholds were not designed for athletes and commonly miss clinically relevant iron depletion in this population.
In sports medicine literature, ferritin below approximately 30 µg/L is commonly treated as absolute iron deficiency in athletes, while values in the 30–99 µg/L range may still be suboptimal for performance depending on symptoms, transferrin saturation, training context, and inflammatory status [7][8]. These are not universally settled consensus cutoffs—debate continues about optimal thresholds—but they represent a widely used framework that better reflects the iron demands of endurance sport than standard clinical ranges. The systematic review of female athletes specifically used a ferritin threshold of <40 µg/L to define iron deficiency [1]. For a more detailed explanation of ferritin interpretation in athletes, see my guide on ferritin levels for athletes.
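The threshold framework above can be summarized as a simple decision sketch. The function below is hypothetical and uses only the ferritin cutoffs discussed in this section; real interpretation must also weigh transferrin saturation, symptoms, training context, and inflammatory status, as noted above:

```python
def classify_ferritin(ferritin_ug_l: float) -> str:
    """Rough athlete-oriented ferritin interpretation (illustrative only).

    Cutoffs follow the sports medicine framework described in the text:
    <30 ug/L treated as absolute deficiency, 30-99 ug/L as a gray zone.
    """
    if ferritin_ug_l < 30:
        return "absolute iron deficiency (athlete threshold)"
    if ferritin_ug_l < 100:
        return "possibly suboptimal; interpret with TSAT, symptoms, inflammation"
    return "iron stores likely adequate"

classify_ferritin(22)   # flags deficiency by the athlete threshold
classify_ferritin(55)   # gray zone: context needed
```

Note that a standard laboratory would label the 22 µg/L result "normal" under a 12–15 µg/L cutoff, which is precisely why athlete-specific thresholds matter.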
Transferrin, total iron-binding capacity (TIBC), and transferrin saturation provide complementary information about iron transport. Transferrin is the main protein that carries iron in the blood, while TIBC reflects the blood’s overall capacity to bind iron via transferrin. Transferrin saturation describes how much of that capacity is actually occupied by iron.
Together, these markers help flag iron deficiency that ferritin alone might not capture, particularly when ferritin is artificially elevated by inflammation (ferritin is an acute-phase reactant and rises during illness, intense training, or stress, temporarily masking depleted stores).
Mean corpuscular volume (MCV) and mean corpuscular hemoglobin (MCH) help distinguish dilutional pseudoanemia from true iron deficiency anemia: these markers are often normal in pseudoanemia and may fall once iron deficiency becomes established enough to produce microcytic, hypochromic red cells. For a more detailed discussion of how MCV changes in athletes, see my guide.
In my clinical practice, many athletes—at least in Finland—train on a relatively tight budget. A large proportion are students, and only a minority have access to sponsorship or comprehensive healthcare coverage. As a result, regular full iron panels are often not financially feasible.
When athletes come to see me wanting to rule out anemia or iron deficiency, I usually start with the most practical and cost-effective tests. In this context, serum ferritin and a basic complete blood count, including hemoglobin and MCV, provide a useful first-line assessment of iron status. More extensive markers such as transferrin saturation, soluble transferrin receptor (sTfR), and related tests are used more selectively in my practice, typically when the clinical picture is unclear or when broader testing is accessible through occupational healthcare, insurance, or higher-level athletic support.
Evidence-Based Solutions When Iron Deficiency Is Confirmed
If assessment confirms true iron deficiency rather than pseudoanemia, the approach is nutritional correction first, with supplementation when dietary measures are insufficient.
Dietary optimization should prioritize heme iron from animal sources (red meat, poultry, fish), which carries an absorption rate of 15–25% compared with 2–5% for non-heme iron from plant sources. Pairing non-heme iron sources with vitamin C significantly enhances absorption. Coffee, tea, and calcium consumed around iron-rich meals competitively inhibit absorption and should be timed away from iron intake.
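The practical gap between heme and non-heme sources becomes clear with simple arithmetic. This sketch uses illustrative midpoints of the absorption ranges quoted above (assumed values, not dietary prescriptions):

```python
def absorbed_mg(iron_content_mg: float, absorption_fraction: float) -> float:
    """Estimated iron absorbed (mg) from a source, given a fractional absorption rate."""
    return iron_content_mg * absorption_fraction

# 3 mg of iron from each source, using midpoint absorption rates:
heme = absorbed_mg(3.0, 0.20)       # heme iron at ~20% -> 0.6 mg absorbed
non_heme = absorbed_mg(3.0, 0.035)  # non-heme iron at ~3.5% -> ~0.1 mg absorbed
```

The same dietary iron content can thus yield roughly sixfold more absorbed iron from heme sources, which is why source quality, not just total intake, drives repletion.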
Supplementation timing is the key practical lever that research has clarified in recent years. Because prolonged running increases hepcidin and can reduce iron absorption for several hours afterward, athletes may absorb iron better when supplements or iron-rich meals are timed away from hard sessions rather than taken immediately post-exercise. Ideally, supplementation occurs on rest days or in the morning before the day’s first training session [6].
Alternate-day dosing has emerged as the evidence-based approach for maximizing absorption. When iron supplements are taken on consecutive days, the post-dose hepcidin elevation from the first dose is still active when the second dose is consumed, reducing fractional iron absorption by 35–45% [9]. Allowing 48 hours between doses—the time required for hepcidin to return to baseline—restores absorption efficiency. Studies in iron-deficient women confirm that fractional iron absorption is 40–50% higher with alternate-day compared to consecutive-day dosing [10]. This approach also typically improves gastrointestinal tolerability, which is the most common reason athletes discontinue supplementation prematurely.
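The 48-hour spacing rationale translates directly into a simple schedule. The helper below is a hypothetical illustration (the start date is arbitrary); it just generates dosing dates two days apart so each dose lands after hepcidin has returned to baseline:

```python
from datetime import date, timedelta

def alternate_day_schedule(start: date, n_doses: int) -> list[date]:
    """Dosing dates spaced 48 hours apart, per the alternate-day protocol."""
    return [start + timedelta(days=2 * i) for i in range(n_doses)]

schedule = alternate_day_schedule(date(2025, 1, 6), 4)
# -> Jan 6, 8, 10, 12
```

In practice, athletes can align these dates with rest days or pre-training mornings, combining the alternate-day and exercise-timing principles described above.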
When iron deficiency is confirmed in an athlete, my first-line approach is always dietary optimization together with oral iron supplementation. I emphasize increasing intake of heme iron, as discussed earlier—primarily from sources such as red meat—when this is feasible for the athlete. Alongside this, iron supplementation is often necessary to restore iron stores effectively.
I generally do not recommend intravenous iron infusions unless there is a clear medical indication. In recent years, “elective” iron infusions have become more common in some settings, but these should not be used without appropriate clinical justification. In my practice, intravenous iron is reserved for cases with clearly documented iron deficiency, particularly when oral supplementation is ineffective, not tolerated, or when a more rapid correction is clinically justified.
Summary
A low hemoglobin level in a runner is not a diagnosis in itself, but a signal that requires proper interpretation. In endurance athletes, reduced hemoglobin concentration is often a normal consequence of plasma volume expansion and does not reflect impaired oxygen-carrying capacity. At the same time, true iron deficiency remains common—particularly in female athletes—and can significantly impact performance, recovery, and overall well-being, even before anemia develops.
The key distinction lies in context. Hemoglobin alone is not sufficient; a structured assessment of iron status, including ferritin and selected supporting markers, is essential to differentiate physiological adaptation from clinically relevant deficiency. In my clinical practice, this distinction is critical, as it directly determines management—ranging from reassurance in cases of sports pseudoanemia to targeted dietary and supplementation strategies when iron deficiency is confirmed.
For runners and clinicians alike, understanding this difference prevents both under-treatment of true deficiency and over-treatment of normal adaptation. When interpreted correctly, blood markers do not just flag problems—they provide a clear roadmap for optimizing both health and performance.
References
[1] https://doi.org/10.1016/j.jshs.2024.101009
[2] https://doi.org/10.1016/j.nut.2024.112516
[3] https://pubmed.ncbi.nlm.nih.gov/1553454/
[4] https://pubmed.ncbi.nlm.nih.gov/1521949/
[5] https://pmc.ncbi.nlm.nih.gov/articles/PMC11698231/
[6] https://pubmed.ncbi.nlm.nih.gov/35661896/
[7] https://pmc.ncbi.nlm.nih.gov/articles/PMC10608302/
[8] https://pubmed.ncbi.nlm.nih.gov/21145121/
[9] https://pubmed.ncbi.nlm.nih.gov/26289639/
[10] https://pubmed.ncbi.nlm.nih.gov/31413088/
