
Is Sunscreen the New Margarine?

Current guidelines for sun exposure are unhealthy and unscientific, controversial new research suggests—and quite possibly even racist. How did we get it so wrong?

These are dark days for supplements. Although they are a $30-plus billion market in the United States alone, vitamin A, vitamin C, vitamin E, selenium, beta-carotene, glucosamine, chondroitin, and fish oil have now flopped in study after study.

If there was one supplement that seemed sure to survive the rigorous tests, it was vitamin D. People with low levels of vitamin D in their blood have significantly higher rates of virtually every disease and disorder you can think of: cancer, diabetes, obesity, osteoporosis, heart attack, stroke, depression, cognitive impairment, autoimmune conditions, and more. The vitamin is required for calcium absorption and is thus essential for bone health, but as evidence mounted that lower levels of vitamin D were associated with so many diseases, health experts began suspecting that it was involved in many other biological processes as well.

And they believed that most of us weren’t getting enough of it. This made sense. Vitamin D is a hormone manufactured by the skin with the help of sunlight. It’s difficult to obtain in sufficient quantities through diet. When our ancestors lived outdoors in tropical regions and ran around half naked, this wasn’t a problem. We produced all the vitamin D we needed from the sun.

But today most of us have indoor jobs, and when we do go outside, we’ve been taught to protect ourselves from dangerous UV rays, which can cause skin cancer. Sunscreen also blocks our skin from making vitamin D, but that’s OK, says the American Academy of Dermatology, which takes a zero-tolerance stance on sun exposure: “You need to protect your skin from the sun every day, even when it’s cloudy,” it advises on its website. Better to slather on sunblock, we’ve all been told, and compensate with vitamin D pills.

Yet vitamin D supplementation has failed spectacularly in clinical trials. Five years ago, researchers were already warning that it showed zero benefit, and the evidence has only grown stronger. In November, one of the largest and most rigorous trials of the vitamin ever conducted—in which 25,871 participants received high doses for five years—found no impact on cancer, heart disease, or stroke.

How did we get it so wrong? How could people with low vitamin D levels clearly suffer higher rates of so many diseases and yet not be helped by supplementation?

As it turns out, a rogue band of researchers has had an explanation all along. And if they’re right, it means that once again we have been epically misled.

These rebels argue that what made the people with high vitamin D levels so healthy was not the vitamin itself. That was just a marker. Their vitamin D levels were high because they were getting plenty of exposure to the thing that was really responsible for their good health—that big orange ball shining down from above.


One of the leaders of this rebellion is a mild-mannered dermatologist at the University of Edinburgh named Richard Weller. For years, Weller swallowed the party line about the destructive nature of the sun’s rays. “I’m not by nature a rebel,” he insisted when I called him up this fall. “I was always the good boy that toed the line at school. This pathway is one which came from following the data rather than a desire to overturn apple carts.”

Weller’s doubts began around 2010, when he was researching nitric oxide, a molecule produced in the body that dilates blood vessels and lowers blood pressure. He discovered a previously unknown biological pathway by which the skin uses sunlight to make nitric oxide.

It was already well established that rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months. Weller put two and two together and had what he calls his “eureka moment”: Could exposing skin to sunlight lower blood pressure?

Sure enough, when he exposed volunteers to the equivalent of 30 minutes of summer sunlight without sunscreen, their nitric oxide levels went up and their blood pressure went down. Because of its connection to heart disease and strokes, high blood pressure is the leading cause of premature death and disease in the world, and the reduction was of a magnitude large enough to prevent millions of deaths on a global level.

Wouldn’t all those rays also raise rates of skin cancer? Yes, but skin cancer kills surprisingly few people: fewer than 3 per 100,000 in the U.S. each year. For every person who dies of skin cancer, more than 100 die from cardiovascular diseases.

People don’t realize this because several different diseases are lumped together under the term “skin cancer.” The most common by far are basal-cell carcinomas and squamous-cell carcinomas, which are almost never fatal. In fact, says Weller, “When I diagnose a basal-cell skin cancer in a patient, the first thing I say is congratulations, because you’re walking out of my office with a longer life expectancy than when you walked in.” That’s probably because people who get carcinomas, which are strongly linked to sun exposure, tend to be healthy types who are outside getting plenty of exercise and sunlight.

Melanoma, the deadly type of skin cancer, is much rarer, accounting for only 1 to 3 percent of new skin cancers. And perplexingly, outdoor workers have half the melanoma rate of indoor workers. Tanned people have lower rates in general. “The risk factor for melanoma appears to be intermittent sunshine and sunburn, especially when you’re young,” says Weller. “But there’s evidence that long-term sun exposure associates with less melanoma.”

These are pretty radical words in the established dermatological community. “We do know that melanoma is deadly,” says Yale’s David Leffell, one of the leading dermatologists in the country, “and we know that the vast majority of cases are due to sun exposure. So certainly people need to be cautious.”

Still, Weller kept finding evidence that didn’t fit the official story. Some of the best came from Pelle Lindqvist, a senior research fellow in obstetrics and gynecology at Sweden’s Karolinska Institute, home of the Nobel Prize in Physiology or Medicine. Lindqvist tracked the sunbathing habits of nearly 30,000 women in Sweden over 20 years. Originally, he was studying blood clots, which he found occurred less frequently in women who spent more time in the sun—and less frequently during the summer. Lindqvist looked at diabetes next. Sure enough, the sun worshippers had much lower rates. Melanoma? True, the sun worshippers had a higher incidence of it—but they were eight times less likely to die from it.

So Lindqvist decided to look at overall mortality rates, and the results were shocking. Over the 20 years of the study, sun avoiders were twice as likely to die as sun worshippers.

There are not many daily lifestyle choices that double your risk of dying. In a 2016 study published in the Journal of Internal Medicine, Lindqvist’s team put it in perspective: “Avoidance of sun exposure is a risk factor of a similar magnitude as smoking, in terms of life expectancy.”


The idea that slavish application of SPF 50 might be as bad for you as Marlboro 100s generated a flurry of short news items, but the idea was so weird that it didn’t break through the deadly-sun paradigm. Some doctors, in fact, found it quite dangerous.

“I don’t argue with their data,” says David Fisher, chair of the dermatology department at Massachusetts General Hospital. “But I do disagree with the implications.” The risks of skin cancer, he believes, far outweigh the benefits of sun exposure. “Somebody might take these conclusions to mean that the skin-cancer risk is worth it to lower all-cause mortality or to get a benefit in blood pressure,” he says. “I strongly disagree with that.” It is not worth it, he says, unless all other options for lowering blood pressure are exhausted. Instead he recommends vitamin D pills and hypertension drugs as safer approaches.

Weller’s largest study yet is due to be published later in 2019. For three years, his team tracked the blood pressure of 340,000 people in 2,000 spots around the U.S., adjusting for variables such as age and skin type. The results clearly showed that the reason people in sunnier climes have lower blood pressure is as simple as light hitting skin.

When I spoke with Weller, I made the mistake of characterizing this notion as counterintuitive. “It’s entirely intuitive,” he responded. “Homo sapiens have been around for 200,000 years. Until the industrial revolution, we lived outside. How did we get through the Neolithic Era without sunscreen? Actually, perfectly well. What’s counterintuitive is that dermatologists run around saying, ‘Don’t go outside, you might die.’”

When you spend much of your day treating patients with terrible melanomas, it’s natural to focus on preventing them, but you need to keep the big picture in mind. Orthopedic surgeons, after all, don’t advise their patients to avoid exercise in order to reduce the risk of knee injuries.

Meanwhile, that big picture just keeps getting more interesting. Vitamin D now looks like the tip of the solar iceberg. Sunlight triggers the release of a number of other important compounds in the body, not only nitric oxide but also serotonin and endorphins. It reduces the risk of prostate, breast, colorectal, and pancreatic cancers. It improves circadian rhythms. It reduces inflammation and dampens autoimmune responses. It improves virtually every mental condition you can think of. And it’s free.

These seem like benefits everyone should be able to take advantage of. But not all people process sunlight the same way. And the current U.S. sun-exposure guidelines were written for the whitest people on earth.


Every year, Richard Weller spends time working in a skin hospital in Addis Ababa, Ethiopia. Not only is Addis Ababa near the equator, it also sits above 7,500 feet, so it receives massive UV radiation. Despite that, says Weller, “I have not seen a skin cancer. And yet Africans in Britain and America are told to avoid the sun.”

All early humans evolved outdoors beneath a tropical sun. Like air, water, and food, sunlight was one of our key inputs. Humans also evolved a way to protect our skin from receiving too much radiation—melanin, a natural sunscreen. Our dark-skinned African ancestors produced so much melanin that they never had to worry about the sun.

As humans migrated farther from the tropics and faced months of light shortages each winter, they evolved to produce less melanin when the sun was weak, absorbing all the sun they could possibly get. They also began producing much more of a protein that stores vitamin D for later use. In spring, as the sun strengthened, they’d gradually build up a sun-blocking tan. Sunburn was probably a rarity until modern times, when we began spending most of our time indoors. Suddenly, pasty office workers were hitting the beach in summer and getting zapped. That’s a recipe for melanoma.

People of color rarely get melanoma. The rate is 26 per 100,000 in Caucasians, 5 per 100,000 in Hispanics, and 1 per 100,000 in African Americans. On the rare occasion when African Americans do get melanoma, it’s particularly lethal—but it’s mostly a kind that occurs on the palms, soles, or under the nails and is not caused by sun exposure.

At the same time, African Americans suffer high rates of diabetes, heart disease, stroke, internal cancers, and other diseases that seem to improve in the presence of sunlight, of which they may well not be getting enough. Because of their genetically higher levels of melanin, they require more sun exposure to produce compounds like vitamin D, and they are less able to store that vitamin for darker days. They have much to gain from the sun and little to fear.

And yet they are being told a very different story, misled into believing that sunscreen can prevent their melanomas, which Weller finds exasperating. “The cosmetic industry is now trying to push sunscreen at dark-skinned people,” he says. “At dermatology meetings, you get people standing up and saying, ‘We have to adapt products for this market.’ Well, no we don’t. This is a marketing ploy.”

When I asked the American Academy of Dermatology for clarification on its position on dark-skinned people and the sun, it pointed me back to the official line on its website: “The American Academy of Dermatology recommends that all people, regardless of skin color, protect themselves from the sun’s harmful ultraviolet rays by seeking shade, wearing protective clothing, and using a broad-spectrum, water-resistant sunscreen with an SPF of 30 or higher.”

This seemed to me a little boilerplate, and I wondered whether the official guidelines hadn’t yet caught up to current thinking. So I asked David Leffell, at Yale. “I think that sun-protection advice,” he told me, “has always been directed at those most at risk”—people with fair skin or a family history of skin cancer. “While it is true that people with olive skin are at less risk, we do see an increasing number of people with that type of skin getting skin cancer. But skin cancer… is very rare in African Americans… and although they represent a spectrum of pigmentation, [they] are not at as much risk.”

Still, David Fisher at Mass General didn’t think that changed the equation. “There’s a pharmacopoeia of drugs that are extremely effective at lowering blood pressure,” he said. “So to draw the conclusion that people should expose themselves to an elevated skin-cancer risk, including potentially fatal cancer, when there are so many alternative treatments for hypertension, is problematic.”


Am I willing to entertain the notion that current guidelines are inadvertently advocating a lifestyle that is killing us?

I am, because it’s happened before.

In the 1970s, as nutritionists began to see signs that people whose diets were high in saturated fat and cholesterol also had high rates of cardiovascular disease, they told us to avoid butter and choose margarine, which was made by bubbling hydrogen gas through vegetable oil to solidify it, a process that creates trans fats.

From its inception in the mid-1800s, margarine had always been considered creepy, a freakish substitute for people who couldn’t afford real butter. By the late 1800s, several midwestern dairy states had banned it outright, while others, including Vermont and New Hampshire, passed laws requiring that it be dyed pink so it could never pass itself off as butter. Yet somehow margarine became the thing we spread on toast for decades, a reminder that even the weirdest product can become mainstream with enough industry muscle.

Eventually, better science revealed that the trans fats created by the hydrogenation process were far worse for our arteries than the natural fats in butter. In 1994, Harvard researchers estimated that 30,000 people per year were dying unnecessarily thanks to trans fats. Yet they weren’t banned in the U.S. until 2015.

Might the same dynamic be playing out with sunscreen, which was also remarkably sketchy in its early days? One of the first sunscreens, Red Vet Pet (short for Red Veterinary Petrolatum), was a thick red petroleum jelly invented in 1944 to protect soldiers in the South Pacific; it must have been eerily reminiscent of pink margarine. Only after Coppertone bought the rights and reformulated Red Vet Pet to suit the needs of the new midcentury tanning culture did sunscreen take off.

However, like margarine, early sunscreen formulations were disastrous, shielding users from the UVB rays that cause sunburn but not from UVA rays, which penetrate deeper and also contribute to skin cancer. Even today, SPF ratings refer only to UVB rays, so many users may be absorbing far more UVA radiation than they realize. Meanwhile, many common sunscreen ingredients have been found to be hormone disruptors that can be detected in users’ blood and breast milk. The worst offender, oxybenzone, also mutates the DNA of corals and is believed to be killing coral reefs. Hawaii and the western Pacific nation of Palau have already banned it, with the bans taking effect in 2021 and 2020 respectively, and other governments are expected to follow.

The industry is now scrambling to move away from oxybenzone, embracing opaque, even neon, mineral-based formulations, a fashion statement reminiscent of the old Red Vet Pet. But with its long track record of pushing products that later turn out to be unhealthy, I remain skeptical of industry assurances that it finally has everything figured out. We are always being told to replace something natural with some artificial pill or product that is going to improve our health, and it almost always turns out to be a mistake because we didn’t know enough. Multivitamins can’t replace fruits and vegetables, and vitamin D supplements are clearly no substitute for natural sunlight.


Old beliefs don’t die easily, and I can understand if you remain skeptical of old Sol. Why trust one journalist and a handful of rogue researchers against the august opinions of so many professionals?

Here’s why: many experts in the rest of the world have already come around to the benefits of sunlight. Sunny Australia changed its tune back in 2005. Cancer Council Australia’s official position paper (endorsed by the Australasian College of Dermatologists) states, “Ultraviolet radiation from the sun has both beneficial and harmful effects on human health…. A balance is required between excessive sun exposure which increases the risk of skin cancer and enough sun exposure to maintain adequate vitamin D levels…. It should be noted that the benefits of sun exposure may extend beyond the production of vitamin D. Other possible beneficial effects of sun exposure… include reduction in blood pressure, suppression of autoimmune disease, and improvements in mood.”

Australia’s official advice? When the UV index is below 3 (which is true for most of the continental U.S. in the winter), “Sun protection is not recommended unless near snow or other reflective surfaces. To support vitamin D production, spend some time outdoors in the middle of the day with some skin uncovered.” Even in high summer, Australia recommends a few minutes of sun a day.

New Zealand signed on to similar recommendations, and the British Association of Dermatologists went even further in a statement, directly contradicting the position of its American counterpart: “Enjoying the sun safely, while taking care not to burn, can help to provide the benefits of vitamin D without unduly raising the risk of skin cancer.”

Leffell, the Yale dermatologist, recommends what he calls a “sensible” approach. “I have always advised my patients that they don’t need to crawl under a rock but should use common sense and be conscious of cumulative sun exposure and sunburns in particular,” he told me.

This does not mean breaking out the baby oil or cultivating a burnished tan. All the experts agree that sunburns—especially those suffered during childhood and adolescence—are particularly bad.

Ultimately, it’s your call. Each person’s needs vary so much with season, latitude, skin color, personal history, philosophy, and so much else that it’s impossible to provide a one-size-fits-all recommendation. The Dminder app, which uses factors such as age, weight, and amount of exposed skin to track the amount of sunlight you need for vitamin D production, might be one place to start. Trading your sunscreen for a shirt and a broad-brimmed hat is another. Both have superior safety records.

As for me, I’ve made my choice. A world of healthy outdoor adventure beckons—if not half naked, then reasonably close. Starting today, I’m stepping into the light.
