Over the course of the 20th century, average American life expectancy skyrocketed by 57 percent, from about 49 years in 1901 to 77 years by century's end.
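For reference, that percentage follows directly from the two endpoints quoted above (a quick arithmetic check, not an additional statistic):

```latex
% Relative increase over the century, using the article's endpoints
\[
\frac{77 - 49}{49} \approx 0.57 \quad \text{that is, an increase of roughly 57\%.}
\]
```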
At first glance, those numbers might suggest that early 20th-century society was bereft of grandparents. That's far from the truth.
In 1999, the U.S. Centers for Disease Control and Prevention put together a list of the ten greatest public health achievements of the previous century. The list included:
- Vaccination
- Improvements in motor-vehicle safety
- Safer workplaces
- Control of infectious diseases
- Decline in deaths from coronary heart disease and stroke
- Safer and healthier foods
- Healthier mothers and babies
- Family planning
- Fluoridation of drinking water
- Recognition of tobacco use as a health hazard
But while all of these factors helped add a few more years to the average American adult's life, their combined effect was overshadowed by one that often goes unmentioned: reduced infant mortality, the rate of death during the first year of life.
Between 1950 and 2001, infant mortality in the United States dropped from nearly 30 deaths for every 1,000 live births to just seven.
Reduced infant mortality, more than any other single factor, has been responsible for the spectacular rise in American life expectancy. The reason is simple: life expectancy at birth is an average age at death, and a death in the first year of life drags that average down far more than anything that happens later. For any of the other advances on the CDC's list to add years to a person's life, a baby first has to survive infancy.
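To see why deaths in infancy weigh so heavily on the average, consider a deliberately simplified cohort model. It treats life expectancy at birth as the mean age at death for 1,000 newborns, assumes infants who die do so at about half a year old, and assumes everyone else lives to the 77-year figure cited above. The two infant-mortality rates are the article's 1950 and 2001 figures; everything else is an illustrative assumption, not CDC or actuarial methodology.

```python
# Toy model (not an actuarial life table): life expectancy at birth is treated
# as the average age at death across a cohort of 1,000 live births, so shifting
# even a small share of deaths out of infancy moves the average noticeably.
#
# Illustrative assumptions:
#   - infants who die do so at 0.5 years of age on average
#   - everyone who survives infancy lives to 77 (the article's end-of-century figure)
# The two infant-mortality rates are the article's figures for 1950 and 2001.

def life_expectancy(infant_deaths_per_1000, adult_lifespan=77.0, infant_death_age=0.5):
    """Mean age at death for a cohort of 1,000 live births under the toy model."""
    survivors = 1000 - infant_deaths_per_1000
    total_years = infant_deaths_per_1000 * infant_death_age + survivors * adult_lifespan
    return total_years / 1000

e_1950 = life_expectancy(30)   # ~30 infant deaths per 1,000 live births (1950)
e_2001 = life_expectancy(7)    # ~7 infant deaths per 1,000 live births (2001)

print(f"1950-style cohort: {e_1950:.1f} years")
print(f"2001-style cohort: {e_2001:.1f} years")
print(f"Gain from reduced infant mortality alone: {e_2001 - e_1950:.1f} years")
```

Under those assumptions, the drop from 30 to 7 infant deaths per 1,000 births adds nearly two years to the average on its own, before any other improvement is counted. Earlier in the century, when infant-mortality rates were several times higher, the same arithmetic accounted for an even larger share of the overall gain.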
According to the CDC, declines in infant mortality over the past five decades have been driven largely by improved access to health care, advances in neonatal medicine, and public health campaigns such as "Back to Sleep," which was designed to curb deaths from sudden infant death syndrome, or SIDS.