Lack of Sun in Teen Years Linked to Nearsightedness Later On

A woman sits in the sunshine. (Image credit: Yeko Photo Studio/Shutterstock.com)

Teens and young adults who spend more time outdoors may be less likely to become nearsighted later in life, a new study suggests. People in the study who had more exposure to ultraviolet B (UVB) radiation between ages 14 and 39, which the researchers calculated from the participants' reported time in sunlight, were less likely to be nearsighted at age 65 than those with less UVB exposure, the researchers found.

"Increased UVB exposure was associated with reduced myopia, particularly in adolescence and young adulthood," the researchers wrote in the study, published yesterday (Dec. 1) in the journal JAMA Ophthalmology. Myopia is a term that eye doctors use for nearsightedness, where people can more clearly see objects if they are closer. [5 Experts Answer: What's the Best Way to Preserve My Eyesight?]

In the study, the researchers looked at 371 people with nearsightedness and 2,797 people without nearsightedness who lived in various locations in Europe, including Norway, Estonia, France, Italy, Greece, Spain and the United Kingdom. The people in the study were 65 years old, on average.

Trained researchers examined the participants' eyesight and collected blood samples to measure the levels of vitamin D in their blood. They did this because previous research had linked higher vitamin D concentrations to a lower risk of nearsightedness.

The researchers also interviewed the participants, asking not only about their education levels, diets and medical histories, but also about how much time they had spent outdoors between 9 a.m. and 5 p.m. from age 14 up to their current age.

The researchers then used the information about the participants' histories of sunlight exposure and their geographical locations to calculate the levels of different wavelengths of outdoor sunlight, including UVB, that the people had been exposed to.

It turned out that people who had been exposed to higher levels of UVB radiation — a factor that's closely related to how much time a person spends outdoors and is exposed to sunlight — as teens and young adults were less likely to be nearsighted at age 65 than those who had been exposed to lower levels of UVB radiation. This is in line with previous research, published in 2015 in the journal JAMA, that suggested that children who spent more time outdoors had a lower risk of becoming nearsighted.

However, in contrast to previous research, the new study did not find a link between higher levels of vitamin D and a person's risk of developing nearsightedness, the researchers said. [9 Good Sources of Disease-Fighter Vitamin D]

The new study shows a link between higher levels of exposure to UVB radiation and a lower risk of nearsightedness, but it does not prove that there is a cause-and-effect relationship between the two.

It is not clear exactly why UVB radiation or exposure to sunlight may be linked to a lower risk of nearsightedness, the researchers said. However, previous research has suggested that sunlight may help activate certain cells in the eye and may modulate the kind of eye growth that is linked to nearsightedness, the researchers said.

Dr. Jules Winokur, an ophthalmologist at Lenox Hill Hospital in New York who was not involved in the study, said the study was interesting but had certain limitations. For example, it relied on people's recollections of how much time they had spent outdoors decades earlier, as teens, and such memories may not be a reliable or accurate source of this type of information, he said.

More research is needed to assess the relationship between people's exposure to sunlight and their risk of nearsightedness, Winokur said.

Originally published on Live Science.
