
Science's Breakthrough of the Year 2020

In March, when cases of COVID-19 began to overwhelm hospitals in the United States, I told my 90-year-old mother that she had to shelter in place. She lives alone in Los Angeles, and to keep her company, I FaceTimed her every night. In the role reversal that happens with time, I became the forever-worried, nagging parent, and she was the ever-doubting, defiant child.

Over my increasingly loud objections, she’d gone to the mall with her sister, had her nails done, and lost 56 cents playing mahjong with “the girls.” The world she knew was dying, and after a few weeks of denial, bargaining, and anger, she finally entered the grief stages of depression and acceptance and quarantined herself.

My mother’s loneliness, fear, and boredom sometimes make me feel like our chats are jailhouse visits on phones separated by a glass wall. “I didn’t leave my house today—again,” she says, day after day, as though it were my fault. “Same everything. Same nothing.” And she plaintively asks the same question.

“When is this going to end?”

FOR MY MOTHER and countless others, life was put on hold this year. But in biomedicine, progress has been astonishingly fast. Just weeks ago it culminated in what the world needs to answer my mother’s question: safe, effective vaccines against COVID-19.

On 31 December 2019, health officials in Wuhan, China, reported a mysterious cluster of pneumonia cases that had sickened 27 people. By 8 January, The Wall Street Journal revealed that Chinese researchers had linked the disease to a novel coronavirus; 2 days later, scientists posted online the genetic sequence of what is now known as SARS-CoV-2. Within hours, the search for a COVID-19 vaccine began.

That first month, confusion reigned. No one knew how deadly SARS-CoV-2 was or how it might threaten global health. China obscured early evidence of human-to-human transmission, and the seemingly limited spread to other countries delayed the World Health Organization from declaring an international health emergency. But as January ended, the global threat had become clear.

By February, several companies had launched aggressive COVID-19 vaccine projects. In China, CanSino Biologics, Sinovac Biotech, and state-owned Sinopharm were first out of the gate. In the United States, the front-runners were Moderna and Inovio Pharmaceuticals. In Europe, BioNTech, a German biotechnology company, developed a candidate it would later share with pharmaceutical giant Pfizer. At the University of Oxford, an academic group created a vaccine that eventually attracted another Big Pharma partner, AstraZeneca. Janssen and Sanofi Pasteur also joined the race.

Two of the Chinese contenders made vaccine candidates with the entire virus, whereas every other effort singled out the SARS-CoV-2 surface protein, spike, which structural biologists were quick to map and study. Spike initiates infection by attaching to receptors that stud human cells. A vaccine might “neutralize” the virus if it could train the body to create antibodies that would glom onto spike at the precise spot where it engages with its receptor.

Developers tapped into a dazzling array of technologies to make an effective vaccine. Moderna and the Pfizer-BioNTech collaboration banked on a strategy that had never before brought any medicine to market: labmade messenger RNA (mRNA). They designed snippets of the genetic code for the spike protein and swathed them in a jacket of fats so they could slip into human cells, which would then make the viral protein.

Inovio opted instead for DNA encoding spike. Still others, including CanSino, Oxford, and Janssen, jiggered harmless viral vectors—most often the cold-causing adenoviruses—to shuttle the spike gene into the body’s cells. Sanofi Pasteur, Novavax, and Clover Biopharmaceuticals genetically engineered spike in cell cultures so their vaccines could present the protein itself.

But making a vaccine isn’t just a matter of choosing a technology. It has to be tested, first for safety and then for efficacy, in thousands of people who receive the shot or a placebo and are monitored to see who gets sick. “You’re not just going to pull a vaccine out of your pocket,” said Anthony Fauci, head of the U.S. National Institute of Allergy and Infectious Diseases, on 11 February. Fauci, who said it typically took “6, 7, 8 years” to develop a vaccine, predicted that small clinical trials would begin in March, but larger trials not until June. In the best-case scenario, he said, “It would take at least 6 or 8 months to know if it works.”

But the best-case scenario was even better than Fauci expected.

The field received a jolt of good news in April, when Sinovac showed for the first time that a COVID-19 vaccine safely protected monkeys from an intentional “challenge” with the virus. The company used an old, and, some thought, outmoded technology: whole, killed virus. But the concept itself now had proof. A flood of other monkey challenge successes followed.

By 20 April, the day after the first report of a monkey success, five companies had vaccines in clinical trials, and no fewer than 71 other candidates were in preclinical development. By the end of the month, U.S. President Donald Trump was touting a project called Operation Warp Speed, which he promised would invest billions in COVID-19 vaccine R&D. “We’re going to fast-track it like you’ve never seen before,” said Trump, whose administration would eventually pump about $11 billion into the program. That proved an unusually truthful claim from the reality-bending president.

The race took several surprising turns in July. Because China had so successfully stopped the spread of the virus, its vaccine candidates had to be tested abroad, slowing their advance. On 27 July, the Moderna and Pfizer-BioNTech candidates both entered efficacy trials that quickly enrolled more participants in hard-hit locales than the Chinese vaccine studies. Those mRNA vaccines became the first to cross the finish line, each reporting roughly 95% efficacy in November.

NINETY-FIVE PERCENT. That’s higher than almost anyone dared hope for. (Influenza vaccines, in a good year, hit 60% effectiveness.) A confluence of forces propelled science from zero to a COVID-19 vaccine at revolutionary speed. Never before have researchers so quickly developed so many experimental vaccines against the same foe. Never before have so many competitors collaborated so openly and frequently. Never before have so many candidates advanced to large-scale efficacy trials virtually in parallel. And never before have governments, industry, academia, and nonprofits thrown so much money, muscle, and brains at the same infectious disease in such short order.
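(For readers curious where a number like 95% comes from: efficacy in a placebo-controlled trial is one minus the ratio of the attack rate in the vaccinated arm to the attack rate in the placebo arm. The sketch below uses illustrative case counts on the rough scale of the mRNA trials, not the exact published figures.)

```python
def vaccine_efficacy(cases_vaccine, n_vaccine, cases_placebo, n_placebo):
    """Efficacy = 1 - (attack rate among vaccinated / attack rate among placebo)."""
    attack_rate_vaccine = cases_vaccine / n_vaccine
    attack_rate_placebo = cases_placebo / n_placebo
    return 1 - attack_rate_vaccine / attack_rate_placebo

# Illustrative: 8 COVID-19 cases among ~21,700 vaccinated participants
# versus 162 cases among ~21,700 who received the placebo.
print(round(vaccine_efficacy(8, 21700, 162, 21700) * 100, 1))  # ~95.1
```

With equally sized arms, the enrollment numbers cancel and the calculation reduces to comparing case counts, which is why a trial needs enough infections in the placebo group before efficacy can be read out.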

Biology, however, may have been the overriding factor in the success of COVID-19 vaccines. In 1990, I set out to write a book that chronicled 1 year in the search for a vaccine that could thwart the AIDS virus. Eleven years later, I published Shots in the Dark: The Wayward Search for an AIDS Vaccine. After recounting one failure after another, I proposed a Warp Speed–like program. But even such a crash effort might not have delivered an AIDS vaccine. HIV, which slowly destroys the T cells that coordinate an immune response, also outwits the body’s other immune warriors.

SARS-CoV-2 is different. Early in the pandemic it became clear that most people developed only mild disease, suggesting the immune system can hold the virus in check—and that vaccine-stimulated immunity might prove an effective defense. HIV and hepatitis C, in contrast, cause lifelong infections.

If SARS-CoV-2 is an easy mark, the mRNA vaccines that delivered the most spectacular early results may soon be joined by many others. One, from Russia’s Gamaleya Research Institute of Epidemiology and Microbiology, reported efficacy results that nearly match those of the mRNA vaccines. Then there are promising, if still confusing, data from China’s Sinopharm and on the AstraZeneca-Oxford candidate, which alone could supply 3 billion doses—more than both mRNA vaccines combined.

As of 10 December, 162 candidates were in development and 52 were already in clinical trials. If even a fraction of those work, different countries may get to choose the vaccines that best fit their budgets and delivery capabilities—and separate vaccines could be available for children, pregnant women, young adults, and the elderly.

To be sure, the clinical trial results reported to date have mainly come from glowing company press releases, not the full presentations of data that could reveal caveats. Vaccine doses will be scarce for even the wealthiest countries until at least spring, and the world’s poor will surely wait longer, despite the creation of a global alliance, the COVID-19 Vaccines Global Access Facility, to increase access.

In other ways, too, the pandemic-battered world has a long trip ahead on a steep mountain road with no guardrails. Vaccine hesitancy, manufacturing problems, and breakdowns in supply chains could botch ambitious rollouts. SARS-CoV-2 might mutate to evade protective immune responses. Vaccines might prevent disease, but not transmission, delaying the end of the pandemic. Worst of all, rare, serious side effects could surface when vaccines move from efficacy trials to entire populations.

STILL, WHEN I FOUND OUT on 15 November—1 day before the news went public—that early data from Moderna matched the hard-to-believe preliminary report from Pfizer and BioNTech that came out the week before, my optimism overflowed, for the first time since the pandemic began. I swore my mother to secrecy and shared the good news.

Over the past few weeks, several countries, including the United States, have granted emergency use authorization to the Pfizer-BioNTech vaccine. More will follow. Moderna’s candidate looks likely to pass regulatory muster over the next few weeks, too.

What a joyous way to end this year. I can stop worrying about my mom dying alone in an intensive care unit, away from all who love her. And she can stop asking whether I’ll let her play mahjong with the girls. I imagine the news spreading between other children and their aging parents, in the break rooms of nursing homes, in hospital hallways, and among the staff who run schools, grocery stores, restaurants, and places of worship.

Normal won’t return for a long time. But in the coming months, as vaccines are rolled out and a fuller picture of their promise emerges, we may finally be able to answer the question, “When is this going to end?”


A divisive disease

As scientists struggled to understand and quell COVID-19, a second pandemic of misinformation and political mayhem raged


Angela Rasmussen is as outspoken a scientist as you are likely to find. And this year she spoke out a lot. One of many researchers who became celebrities during the COVID-19 pandemic, the virologist at Georgetown University was quoted in hundreds of articles, appeared as an expert on TV and radio, and took to Twitter to put news about mutations or reinfections into context—and to call radiologist and top U.S. government adviser Scott Atlas a “gaslighting motherf---er.”

But Rasmussen’s messages did not resonate with everyone—even in her own family. Split along political fault lines in the Trump era, some of her relatives no longer speak to one another, she says. When one of her aunts ended up in intensive care with COVID-19 in the summer, Rasmussen only found out because a cousin texted her, worried that others in the aunt’s household did not feel the need to quarantine and get tested. “Guess what: They all had COVID,” Rasmussen says. “But my own family, because of politics, did not reach out to the COVID expert in the family.”

A similar dynamic played out in myriad variations across the United States and the globe as the coronavirus pandemic unfolded. Researchers worked fast and furiously and achieved breakthroughs big and small. But for science’s relations with the wider world, this year marked a breakdown: in communication, in trust, in the sense of a shared reality.

The pandemic was the type of threat researchers had worried and warned about for years: a deadly animal virus, new to humans, and spread in the breath we exhale. “If you had asked me 5 years ago what would keep me most awake at night, this almost defines it perfectly,” says Jeremy Farrar, who heads the Wellcome Trust.

And this virus had help. A “syndemic” is the intersection of two epidemics—two diseases ravaging a population at the same time, exacerbating each other. HIV weakens the immune system, for instance, which makes people more likely to develop tuberculosis. The world witnessed something similar this year. We live in an ecosystem that allows viruses to cross from wildlife to humans more often and spread farther and faster than ever before—that gave us SARS-CoV-2. But the virus emerged in an information ecosystem that helps misinformation and lies spread faster than scientific evidence, weakening our ability to respond to new threats. That made the pandemic far worse.

NEWS OF A CLUSTER of pneumonia cases in China emerged on the eve of the new year. Ten days later, a full genome sequence of the virus was posted online. A diagnostic test was ready a few days later. A team of experts sent to China by the World Health Organization (WHO) in February came home with a surprising finding: China had done the impossible—halting the outbreak of a respiratory pathogen—by locking down its citizens. Science was working faster than ever before.

But the virus was faster. Carried around the world by travelers, it spread surreptitiously at first but quickly sickened and killed patients at a rate that threatened to overwhelm health care systems. As scientists, doctors, and nurses worked around the clock, countries on all continents tried to follow the Chinese example, depriving the coronavirus conflagration of the oxygen it needed: human contact.

“Science is our exit strategy,” Farrar told Science in those dark days of the first peak. And in many ways, science delivered. It launched an all-out effort to develop animal models and diagnostics, chart the pathogen’s path of destruction through the human body, find drugs, and develop vaccines. “We took out all our fancy tools and applied them to this virus,” says virologist Florian Krammer of the Icahn School of Medicine at Mount Sinai.

With acceleration came accidents. Preprint servers like bioRxiv and medRxiv became hubs for sharing information quickly, but they occasionally spread misinformation just as fast. Papers suggesting SARS-CoV-2 was an engineered virus, or that it came from snakes, found a platform they did not deserve—and widespread media coverage. Peer-reviewed journals slipped up as well. The Lancet and The New England Journal of Medicine fell for fraudulent papers purportedly containing data from hundreds of hospitals around the world, collected by a tiny company few had ever heard of. Many research results, including the stunning vaccine data of the past few weeks, were communicated directly to journalists, bypassing any scientific scrutiny. “It is a pandemic by press release,” says WHO epidemiologist Maria Van Kerkhove.

Still, for those willing to learn, this year presented an unprecedented opportunity to see science at work—to hear experts explain viruses and vaccines, see them critique each other’s papers on Twitter, and understand how in science, uncertainty and self-correction are strengths rather than flaws. The process of science was rarely as visible as this year. It was like watching open-heart surgery live on TV: messy but vital and riveting.

BUT WHEN IT CAME to countering the other plague, that of disinformation and deception, the toolbox was empty. Just as videoconferencing and online shopping found massive new markets as stores, schools, and offices closed, so polarization, politicization, and a media ecosystem that elevates simple lies over complex truths were ready to take advantage of an unsettled public struggling with uncertainty. Even as hundreds of thousands died, many people downplayed the problem or refused to acknowledge its existence, no matter what the experts said. “It’s a little like watching a zombie movie in which half of the people can’t see the zombies and keep demanding to know what the fuss is about,” says epidemiologist William Hanage of the Harvard T.H. Chan School of Public Health. Politicians and some physicians began to promote drugs without evidence. The White House flouted epidemiologists’ advice about face masks and SARS-CoV-2’s propensity to spread in clusters indoors—and itself became the site of a superspreading event.

Scientists, not the virus, became the enemy for some. Top virologists needed police protection. Many other researchers reported threats and harassment, with women often subjected to the worst of it. “I used to think it only took brains, but now you need to be brave and courageous as well to do science,” Mike Ryan, executive director of WHO’s Health Emergencies Programme, said during a November press conference. No wonder many scientists did not speak out.

Conspiracy theories flourished. People burned down cellphone towers, blaming them for the pandemic. Others tried to film in hospitals they said were empty. It was all planned. It was all fake. Or maybe it was both.

Scientists themselves contributed to the confusion. French microbiologist Didier Raoult touted hydroxychloroquine based on a study with few participants and no real control group. Stanford University statistician John Ioannidis, once described as “the scourge of sloppy science,” was accused of being less than rigorous himself in studies that suggested SARS-CoV-2 was not all that deadly. Three scientists with high-profile affiliations published the Great Barrington Declaration, which advocated for shielding the most vulnerable in society while letting the virus infect everyone else to build up herd immunity, a strategy most epidemiologists considered dangerously misguided.

Such episodes played into the desire for easy solutions: a cure-all pill, a disease that was less dangerous, a quick return to life before the pandemic. Some scientists may have been driven by a healthy distrust of accepted wisdom or a contrarian spirit, but the effect was reminiscent of industry’s playbook in the fights over tobacco and climate change: Create just enough confusion about the evidence to allow people to carry on as before.

Science worked best when many researchers joined hands. Hundreds of small drug studies didn’t result in clear answers, but two big trials—the United Kingdom’s Recovery and WHO’s Solidarity—convincingly relegated hydroxychloroquine and other drugs to the dustbin while showing that dexamethasone, a cheap steroid, cut deaths by one-third. Thousands of scientists signed the John Snow Memorandum, a riposte to the Great Barrington Declaration that declared the herd immunity strategy “a dangerous fallacy.” The vaccines, too, were the product of thousands of scientists and doctors working together.

In the end, science may save the day—we’ll find out in the months and years ahead whether vaccines can defeat the virus. But the pandemic was a stress test for the scientific enterprise. Some cracks that had long been there, small enough to be ignored by many, widened into deep fissures.

FARRAR IS HOPEFUL that humanity will come away wiser after staring into the abyss. “I think we will look back after the horror of this and say, humanity is incredibly vulnerable,” he says. “This will inspire a whole generation to come into science.” Evidence, he says, will carry the day.

But a new crisis is coming that scientists have warned and worried about for years—one that is slower, yet even more menacing, and far easier to ignore or deny. “You know the biggest deal of this year?” Hanage asks. “When it comes to climate change we are totally screwed.”

There will be no easy scientific fix for global warming. And if this pandemic has shown anything, it is that evidence without action is like a vaccine in a freezer: It is all potential. Scientists knew deaths would follow cases as sure as thunder follows lightning. And yet politicians and ordinary citizens alike found it hard to act until morgues were overflowing. Some refused to acknowledge reality even then. How much harder will it be to act on climate change?

The upshot of this year cannot just be more research on unknown pathogens lurking in nature. It has to be an effort to revive and strengthen the bonds between science and the rest of society.

SARS-CoV-2 did not just disrupt the world. It shattered the fragile mirror we thought of as reality. Without a shared reality, we will be defenseless in the next crisis.


A year like no other

This year saw an explosion of research into COVID-19—by mid-December, more than 200,000 papers had been published in peer-reviewed journals, and many more had been posted online to non-peer-reviewed preprint servers. Key findings appear below, pegged to the date they first appeared online. At the same time, the world was struggling—and in many places, failing—to contain the pandemic.

31 December 2019

Health officials in Wuhan, China, report a cluster of mysterious pneumonia cases.

8 January

A novel coronavirus is publicly identified as the source of the outbreak.

10 January

Genetic sequence posted online

20 January

Human-to-human transmission confirmed

23 January

Wuhan lockdown begins

23 January

The novel coronavirus is 96.2% similar to a virus in bats.

30 January

WHO declares the outbreak a Public Health Emergency of International Concern.

5 February

Diamond Princess cruise ship begins its 2-week quarantine.

11 February

The virus gets its name.


Age, hypertension, diabetes, heart disease, and male sex are all confirmed as risk factors for severe disease.

17 February

Widespread undocumented infection led to the rapid spread of COVID-19.

19 February

Scientists image the atomic-level structure of the spike protein.

23 February

First major European outbreak in Italy

11 March

WHO declares the coronavirus outbreak a pandemic.

16 March

Moderna and CanSino vaccine trials launch.

16 March

Imperial College London model suggests health care systems will be overwhelmed.

26 March

New York City becomes the epicenter of the U.S. outbreak.

26 March

Widespread viral RNA in hospital rooms suggests airborne transmission.

28 March

U.S. regulators issue emergency use authorization (EUA) for hydroxychloroquine.

Early April

COVID-19 can damage multiple organ systems, including the heart, blood vessels, kidneys, and brain.

3 April

U.S. public health officials suggest people wear cloth masks.

19 April

First evidence that a COVID-19 vaccine can protect monkeys

24 April

Asymptomatic carriers play a large role in COVID-19 transmission.

27 April

Remdesivir reduces hospital stays in severely ill patients.

1 May

U.S. regulators issue EUA for remdesivir.

15 May

Operation Warp Speed vaccine project launches.

4 June

Major medical journals retract coronavirus papers over fabricated data.

5 June

Hydroxychloroquine does not reduce death in hospitalized patients.

15 June

U.S. regulators revoke EUA for hydroxychloroquine.

16 June

Dexamethasone reduces fatality by up to one-third in hospitalized patients.

6 July

United States sends a formal notice of withdrawal from WHO.

9 July

Early studies hint at the lingering symptoms of “long COVID.”

25 August

First confirmed case of reinfection

9 September

Two-thirds of countries have joined COVAX, an international vaccine distribution effort.

24 September

Up to 14% of severe cases are linked to genetic factors.

Late October

Europe and the United States confront their second and third waves.

9 November

Pfizer and BioNTech announce vaccine efficacy above 90%.

16 November

Moderna reveals vaccine efficacy nearing 95%.

23 November

AstraZeneca reports vaccine efficacy between 62% and 90%.

24 November

Russia’s Gamaleya institute reports vaccine efficacy of 91.4%.

2 December

U.K. regulators are the first to authorize the Pfizer-BioNTech COVID-19 vaccine.

9 December

China National Biotec Group vaccine has 86% efficacy, according to UAE health ministry.

14 December

U.S. health care workers begin to receive Pfizer vaccine.



Ones we’ve lost


COVID-19 has made 2020 a cruel year for us all. As Science went to press, the global toll of the pandemic had already exceeded 1.6 million, a tragic number that includes scientists of all specialties, ages, and backgrounds. Behind the mind-numbing total are individuals, each a spark of ingenuity, imagination, and creative spirit. Because we can’t do justice to every life lost, we’ve chosen only a few. In remembering them, we mourn the much larger losses for the scientific community—and the world.

Li Wenliang


Li Wenliang did not set out to be a hero. On 30 December 2019, the 33-year-old Wuhan Central Hospital ophthalmologist warned a small group of colleagues that cases of a severe acute respiratory syndrome–like illness had been confirmed in area hospitals. “Don’t spread the word, let your family members take precautions,” he wrote in a brief message.

Someone did spread the word, which went viral. Four days later, Li was called to a meeting with local police, who forced him to confess to spreading rumors. When the young doctor fell ill with COVID-19 less than 1 week later, he took to the microblogging website Sina Weibo to tell his story.

Citizens were outraged at the officials’ tactics. As Li’s condition deteriorated in the early hours of 7 February, millions followed media updates on his condition. When his hospital confirmed his death shortly before 3 a.m., thousands of locked-down Wuhan citizens came to their high-rise windows, calling his name and grieving. He became “the face of COVID-19 in China,” independent social media expert Manya Koetse wrote on her What’s on Weibo website.

The outrage forced an investigation, and in March, officials formally exonerated Li and apologized to his family.

Although his death shook a nation, Li was a modest man dedicated to his work. In one social media post, he apologized to his patients for being irritable, then added that he enjoyed his fried chicken dinner after “thinking about it all day.”

Ten months after his death, more than 1.5 million people still follow Li’s Weibo page. His final post, in which he finally reported testing positive for the COVID-19 virus weeks after infection, has drawn more than 1 million comments, with dozens more posted every day. Writers address Li as if he’s an old friend, sharing their daily troubles and future hopes, calling him an inspiration, and sending birthday wishes. Many sent congratulations when his wife gave birth to their second son, about 4 months after Li died.

In early February, Li was “a symbol of public anger against the failure of the Chinese system to address the COVID-19 pandemic,” says Steve Tsang, a political scientist who focuses on China at the SOAS University of London. As China brought the outbreak under control, that anger has largely dissipated. But in the eyes of the public, Li remains the hero he never set out to be.

Gita Ramjee


After finishing her Ph.D., epidemiologist Gita Ramjee made a decision that would change the course of her life—and many others. She paused her work on childhood kidney disease to explore whether vaginal microbicides could protect South African women from contracting HIV. It was 1994, and the world was at the height of the AIDS crisis: Few treatments were available, and no end was in sight. And women, especially sex workers, were being hit increasingly hard.

The largely overlooked plight of these women “sparked her passion,” says Gavin Churchyard, CEO of the Aurum Institute, the HIV and tuberculosis prevention nonprofit where Ramjee was chief scientific officer. “She wouldn’t just sit back and allow things to happen,” Churchyard says. “She would make them happen.”

Ramjee, a fierce advocate for women’s health, devoted the rest of her life to searching for ways to prevent HIV infection and providing them to the communities that needed them most. She held herself and colleagues to high standards, Churchyard says, pushing for excellence in an area of research that often had disappointing results. “She was a persevering and dedicated person,” says social scientist Neetha Morar, whom Ramjee mentored at the South African Medical Research Council (SAMRC). “Every time a negative result came through, she would get up and continue on.”

Ramjee’s commitment to work was matched only by her devotion to her family, Morar says. When her two sons still lived at home, she would prepare a full meal—with handmade bread—every day before work, to make sure her family ate dinner together in the evening. After her sons left home, Morar says, she kept up the ritual with her husband. She was ecstatic at the birth of her first grandchild and often brought pictures to the office to show colleagues, Churchyard remembers.

Ramjee died on 31 March at age 63. Even months after her passing, her life’s labor is still bearing fruit, says Wafaa El-Sadr, an epidemiologist at Columbia University and longtime collaborator. While at SAMRC, Ramjee oversaw on-site trials for a long-acting antiviral injection recently found to be more effective than a daily pill at preventing HIV in women. “She would have been thrilled,” El-Sadr says. “It’s very bittersweet to have this amazing victory and she’s not around to celebrate.”

Lynika Strozier


Lynika Strozier lay in a hospital bed dying of COVID-19 as Black Lives Matter protesters took to the streets of Chicago in June. The 35-year-old geneticist was a gifted laboratory scientist, a passionate teacher, and a mentor to scores of students, many from underrepresented backgrounds. “Science was her baby,” says her grandmother, Sharon Wright, who raised Strozier from birth.

Her path wasn’t easy. Strozier was diagnosed early in life with a learning disability, and she had to study harder than her peers, Wright says. But she was a natural when it came to lab work, discovering her talent in college when she landed an internship taking care of cell lines at Truman College. “Most of us would have given up—and she always persevered,” says Matt von Konrat, a botanist at the Field Museum who watched Strozier move from intern to research assistant to collections associate at the museum’s Pritzker DNA Laboratory, where she studied evolution in liverworts, birds, and other organisms.

By 2018, Strozier had completed two master’s degrees, one in biology and one in science education, before starting a job teaching ecology and evolution in January at Malcolm X College. “We had hoped that that would be just the beginning of her success story,” says Sushma Reddy, an evolutionary biologist at the University of Minnesota, Twin Cities, who was Strozier’s graduate adviser at Loyola University Chicago. She was an inspiring teacher, Reddy says, and she was also someone her students could aspire to be.

“She was literally the first Black scientist I ever met,” says Heaven Wade, a biochemistry undergraduate at Denison University whom Strozier mentored during an internship at the Field Museum. “We all loved her.” Wade, who is also Black, credits Strozier with keeping her in science: When she considered switching her major because she wasn’t feeling “very welcome” in her program, Strozier persuaded her to stay. “She was so encouraging … it really inspired me to keep going.”

Even now, Strozier continues to inspire young scientists. Her colleagues came up with the idea of creating an internship in her name, to help women of color gain research experience at the Field Museum. The fund is halfway to its $100,000 goal. “That’s what Lynika would have wanted,” Reddy says.

John Houghton


John Houghton loved a good country walk. So when the British climate scientist, instrumental to sounding the global alarm on climate change, found himself with a free afternoon at the National Center for Atmospheric Research’s Mesa Laboratory near Boulder, Colorado, he headed straight out the back door—and into the Rocky Mountains. He even convinced a handful of fellow visitors, in inappropriate shoes, to join him as the sunlight waned.

That spirit of exploration was fundamental to Houghton, who began his career in the 1960s developing space-based sensors that used the radiation emitted by carbon dioxide to take the atmosphere’s temperature. Those measurements soon helped make clear that the burning of fossil fuels could, in a few generations, deeply alter the planet. In time, Houghton found himself in a position to make a difference, directing the United Kingdom’s Met Office and helping lead the first three reports from the United Nations’s Intergovernmental Panel on Climate Change (IPCC).

Houghton was widely regarded as brilliant, but it was his emotional intelligence that made him so effective, says Robert Watson, a former IPCC chairman. “He showed respect for people,” Watson says. During the summit of the third IPCC assessment, published in 2001, government representatives spent the entire first day squabbling over a sentence that explained who was preparing the report. Fellow panelist David Griggs despaired of getting more controversial language approved.

“Everyone wants to hear their own voice,” Houghton told him. “If I allow them to take control now, they’ll allow me more flexibility later.” And sure enough, by the third day, the representatives were willing to include a sentence that is now seen as a turning point in climate science: “Most of the warming observed over the last 50 years is attributable to human activities.”

Like many who led the charge on climate change, Houghton, who died in April at age 88, did not live to see the world mount a credible response. But he never lost faith in humanity, Griggs says. “He always felt, in the end, people would respond and act on climate change.” That optimism may have stemmed in part from Houghton’s deep Christian faith, which led him to engage with climate change skeptics—and sometimes convince them, Watson says.

The hikers who set off from the Mesa Lab that afternoon never made it to the summit, says Griggs, who was among them. A pitch-black night fell, and they were ready to bed down outside—but Houghton believed they’d find the road back.

They did.

Lungile Pepeta


When the breadwinner of a Xhosa family dies, mourners say umthi omkhulu uwile, a mighty tree has fallen. That’s what family, friends, and colleagues felt when Lungile Pepeta, a leading South African pediatric cardiologist, dean of health sciences at Nelson Mandela University, and tireless champion of rural and child health care, lost his life to COVID-19 at age 46, says Samkelo Jiyana, a pediatric cardiologist who trained under him.

“He was an incredible person,” says pediatric cardiologist Adèle Greyling, who also trained under Pepeta. “It was a devastating loss for us all.”

Pepeta, who grew up in Eastern Cape province, never forgot his roots. After his training in Johannesburg, he returned to the Eastern Cape, where he founded the poverty-stricken province’s first pediatric cardiology unit and began to train others to follow in his footsteps. Before his arrival, children with serious heart conditions were forced to travel hundreds of kilometers—often by bus or even hitchhiking—to medical centers in major cities.

Pepeta’s research at Nelson Mandela University, in the Eastern Cape, focused on congenital heart conditions and rheumatic heart disease, which often arises from untreated streptococcal throat infections. It is a “disease of the poor,” says Jiyana, who now works at Netcare Greenacres Hospital in Port Elizabeth, South Africa. But when the pandemic reached the Eastern Cape, Pepeta launched a public information battle through social media and TV interviews in which he urged social distancing and the isolation of anyone who might be infected. He also called for coordination between the region’s public and private health care systems and advised the provincial government on its pandemic response.

Pepeta did not live to see the achievement of one of his most ambitious dreams: the opening of South Africa’s 10th medical school, at his university. He deliberately located the school on the Missionvale campus—once an apartheid-era university built for Black people—to fulfill its mission of delivering “proper healthcare for all our communities,” he wrote last year.

On his birthday on 16 July, Pepeta was in the hospital with COVID-19 symptoms when he received news that the medical school’s accreditation application had been approved. Soon after, when he was already on high-flow oxygen and within days of being admitted to the intensive care unit, he submitted his final paper to a medical journal. He died on 7 August. “He did the work of two or three other people in his lifetime,” Greyling says. “I don’t think we’ll ever meet anyone like him again.”

Maria de Sousa


When Portuguese immunologist Maria de Sousa was teaching at the University of Porto in the 1990s, she would take her students to the city’s famous art museum, housed in a former 18th-century palace. She would tell them to describe a painting, then take a second look. “She wanted to teach people how to see, because people miss what’s there,” says Rui Costa, a neuroscientist at Columbia University and former student.

De Sousa herself looked beyond the obvious in a career that took her to top research centers in the United Kingdom and New York City, then back to her home country. Her discoveries, and her tireless devotion to Portuguese science, earned her the status of a revered hero in the research community. She died in Lisbon, Portugal, on 14 April at age 80.

De Sousa’s work in immunology began in the 1960s, when a dictator ruled Portugal and most young women had no choice but to become homemakers. After earning a medical degree, de Sousa left at age 25 for graduate studies in London and Glasgow, U.K. There, she examined mice from which the thymus—an organ whose role in the immune system was just coming to light—had been removed soon after birth. A whole class of immune cells produced by the thymus was missing from the animals’ lymph nodes. She realized that the cells, now called T cells, must migrate from the thymus to specific areas in the lymph nodes, where they stand ready to fight pathogenic invaders. The discovery soon became part of standard immunology textbooks.

De Sousa moved to New York City in 1975 and later established a cell ecology lab at what is now Memorial Sloan Kettering Cancer Center. But she was drawn back to Portugal in 1984 to study hemochromatosis, an inherited disease common in the northern part of the country that causes a harmful overload of iron in the blood.

De Sousa also had a second mission: to bring scientific rigor to Portugal’s then-weak research institutions. She worked with the country’s science minister to establish outside reviews of university research programs. And de Sousa pushed for Portugal’s first graduate programs in biomedical science, including a highly regarded Ph.D. program that she led at the University of Porto. “She spearheaded a revolution in Portuguese science,” Costa says.

De Sousa was not only a creative scientist and demanding mentor; she was also a poet, pianist, and art lover. “She was the quintessential intellectual,” Costa says. After her death, Portugal’s president, Marcelo Rebelo de Sousa, remembered her as “an unmatched figure in Portuguese science.”

Mishik Kazaryan and Arpik Asratyan


In 1980, at the tender age of 32, experimental physicist Mishik Kazaryan won the Soviet Union’s top science prize for his pioneering work on metal vapor lasers. At the same time, his wife—epidemiologist Arpik Asratyan—was making her own mark as a scientist, crisscrossing the vast nation and probing disease outbreaks. The high-achieving couple, who mentored scores of scientists, persevered through the Soviet collapse and the subsequent privations visited on Russian research. But within days of celebrating their 45th wedding anniversary, they succumbed to COVID-19: Asratyan first, on 27 March, and Kazaryan 10 days later.

The couple ran a science-first household: Their daughter, Serine Kazaryan, is a gynecologist with the Global Medical System Clinic in Moscow, and their son, Airazat Kazaryan, is a gastrointestinal surgeon at the Østfold Hospital Trust in Grålum, Norway. Talk at the dinner table often revolved around research, and daughter, father, and mother published several papers together.

Mishik Kazaryan, born in Armenia, spent his entire working life at one of Russia’s scientific powerhouses, the P. N. Lebedev Physical Institute. His research spanned areas including high-power tunable lasers, laser isotope separation, and laser medicine; he collaborated with Alexander Prokhorov, who shared the 1964 Nobel Prize in Physics for the invention of the laser. Mishik Kazaryan’s “major contribution,” Serine Kazaryan says, was a self-heating copper vapor laser—the brightest pulsed visible-light laser—that found wide use in the precision machining of semiconductors and other materials.

Asratyan, also born in Armenia, first studied Mycoplasma hominis, a then–little-known bacterium linked to pelvic inflammatory disease, vaginosis, and respiratory ailments. She became a leading figure in the diagnosis of hepatitis B and C at the Gamaleya Research Institute of Epidemiology and Microbiology, and she spent much of her career working with vulnerable individuals: drug addicts and those with psychiatric afflictions or HIV.

“I don’t remember my parents to complain of anything,” says Serine Kazaryan, who lived with her son, daughter, and parents in an apartment in Moscow. They all took ill in mid-March. Serine Kazaryan and her children recovered. Her parents did not.

Right up until his last days, Mishik Kazaryan was wrapping up a book about the laser cutting of glass. It was “very touching,” Serine Kazaryan says, when an old friend and co-author, Valery Revenko of the JSC Scientific Research Institute of Technical Glass, vowed to complete it.

Ricardo Valderrama Fernández


Peruvian scientist and politician Ricardo Valderrama Fernández was first in many things. In the 1970s, he was among the first anthropologists to make contact with the Kugapakori, an Indigenous group living in the Peruvian Amazon. He co-founded the first research institute for Andean studies in Cusco in 1974. And in 1977, he wrote a “revolutionary” work on Indigenous, Quechua-speaking laborers, in which—breaking with anthropological traditions of the time—their testimony took center stage.

The book, one of the first works on contemporary Andean culture, “broke the barrier” between anthropology and politics, says César Aguilar León, an anthropologist at the National University of San Marcos. Gregorio Condori Mamani: An Autobiography documented the poverty, discrimination, and mistreatment faced by those left behind in a society grappling with the legacy of Spanish colonialism.

“We wanted to be the voice of those who are not heard, to write the words of those who cannot read and write,” says Valderrama Fernández’s co-author and wife, anthropologist Carmen Escalante Gutiérrez.

The couple always worked together and published four more books and dozens of articles on the legends and customs of the Andean people. They immersed themselves in remote communities and lived alongside Indigenous people for months. Valderrama Fernández’s fluency in Quechua, including its abstract theological and philosophical concepts and terms, helped him speak freely with Andean elders and understand how they adapted their ancient cosmology to the present. His love of the language, which he learned from his grandmother, never diminished. “That’s what made him special,” Escalante Gutiérrez says.

Valderrama Fernández taught for 30 years at his alma mater, the National University of Saint Anthony the Abbot in Cusco. In his final years, he embarked on a second career in politics, advocating for the region’s Indigenous people. In 2006, he was elected to the municipal council of his hometown; in December 2019, he became interim mayor of Cusco, after his predecessor left office under a cloud of corruption charges.

In his new role, Valderrama Fernández led the COVID-19 response in Cusco, visiting markets and other areas of the city to enforce health measures. He caught the virus on one of those visits and died on 30 August at age 75.

Jan Szemiński, a historian at the Hebrew University of Jerusalem, says the world has lost a great anthropologist—and someone who embodied the Incan ideal of reciprocity, or ayni: the idea that you should give to others today—knowing that tomorrow, you will receive.

John Horton Conway


John Horton Conway liked to have fun. The U.K.-born mathematician cut a broad path, making important contributions to geometry, group theory, and topology. But unlike some great mathematicians who grind away on inscrutable problems in jealously guarded isolation, Conway—who worked at the University of Cambridge and Princeton University—was gregarious, talkative, and, above all, playful, often drawing deep insights from mathematical games.

In the 1970s, while musing about the end stage of the board game Go, Conway expanded the concept of real numbers into something called “surreal numbers,” which include infinitesimals smaller than any positive real number and infinite numbers larger than any real number. In 1985, he and four colleagues essentially wrapped up an entire subfield of math by cataloging the finite simple groups, the building blocks from which every group with a finite number of elements is assembled. (A group is a closed set of elements and a rule akin to addition or multiplication for combining them—for example, all rotations that leave the image of a featureless cube the same.)

Most famously, in 1970 Conway invented something he called the game of life. Imagine a grid of squares, some colored black for “living,” others colored white for “dead,” with rules for changing a square’s color that depend on those of its neighbors. The simple system can produce a shocking variety of moving patterns depending on its initial configuration, and the game became popular as computers made their way into everyday life. Conway showed the squares could also be configured to do computations.
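The game’s rules fit in a few lines of code. Here is a minimal sketch in Python (the function and variable names are ours, not Conway’s): a live cell with two or three live neighbors survives, a dead cell with exactly three live neighbors is born, and every other cell dies or stays dead.

```python
from collections import Counter

def step(live):
    """Advance Conway's game of life one generation.
    `live` is a set of (x, y) coordinates of living cells."""
    # Count how many live neighbors each cell on the grid has.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row flip between horizontal
# and vertical orientations, repeating with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
assert step(step(blinker)) == blinker
```

From rules this simple come patterns that glide across the grid, grow without bound, or, as Conway showed, carry out arbitrary computations.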

As impressive as Conway’s genius was his generosity of spirit, says Marjorie Senechal, a mathematician at Smith College. In the 1990s, she helped organize summer geometry institutes to build bridges among professional mathematicians, math teachers, and students. The first few summers, the pros simply lectured the others, Senechal says. Then she invited Conway, and everything clicked. “He didn’t see these as separate communities,” she says. “He was like the Pied Piper. He’d go to get a coffee and a hundred people would follow him.”

Conway, who died in April at age 82, would prowl the Princeton math department at night, chatting with anyone he could find about his latest interest, recalls Timothy Hsu, a mathematician at San Jose State University who earned his doctorate with Conway in 1995. Unkempt and funny, Conway studiously ignored his mail, but could be reached by phone—in the department common room. “Towards the end of my graduate career, he told me that because math is such a forbidding subject, it helps to make yourself slightly ridiculous,” Hsu says. Conway then teased, “That seems to come naturally to you.”

Donald Kennedy


Neurobiologist Donald Kennedy brought a towering intellect, insatiable curiosity, and abiding interest in both the concerns of individuals and the fate of society to everything he did. The longtime faculty member and former president of Stanford University “could talk to people about science without condescending to them,” says research advocate Thomas Grumbly, a friend and colleague. “And he could stand toe to toe with the best scientists in the world.”

Kennedy, who died on 21 April at age 88, relished his role as a scientist, educator, public servant, and communicator—even when his views did not prevail. After Congress refused to embrace his proposed ban on the artificial sweetener saccharin while he was commissioner of the U.S. Food and Drug Administration in the late 1970s, he questioned its logic. The body had “established a principle,” Kennedy said. “You shouldn’t have cancer-causing substances in the food supply, unless people like them a lot.”

That dry wit did him no favors in a subsequent fight with a congressional panel investigating Stanford’s questionable use of federal research funds during his tenure as president. The fallout from that grueling inquiry led him to step down from the presidency in 1991.

In 2000, Kennedy became editor-in-chief of Science. He used the platform to prod climate researchers to work harder on public outreach, condemn politicians who bent—or ignored—scientific findings to serve their own purposes, and publish the best research on the planet, including the first sequence of the human genome.

Kennedy had been a larger-than-life figure at Stanford, whether dashing around campus on his bike or posing bare-chested with the championship swim team. He brought that enthusiasm to the journal, where he also liked to shine a light on the personal side of science.

One of his editorials accompanied a 2005 paper describing a sighting of the ivory-billed woodpecker, long thought to be extinct. Kennedy recounted how, at age 7, he wrote a “fan letter” to famed Cornell University ornithologist Arthur Allen about Allen’s pursuit of the fabled bird. The letter, signed “Love, Donny,” prompted a reply that ended “Love, Arthur.”

The woodpecker sighting didn’t hold up to scrutiny. But Kennedy’s point did: that an encouraging word from a senior scientist could have a lasting impact on a curious child. In fact, one could say Kennedy spent his entire career paying forward that kindness.


First CRISPR cures?

Sickled blood cells (foreground) have been fixed—at least temporarily—by the gene-editing tool CRISPR. SCIENCE PICTURE CO/SCIENCE SOURCE

Since the revolutionary genome-snipping tool known as CRISPR burst on the scene in 2012, it has given researchers new power to engineer crops and animals, stirred ethical debates, and earned a Nobel Prize—not to mention Science’s Breakthrough of the Year in 2015. Now, CRISPR is again making waves, scoring its first success in the clinic by treating two inherited blood diseases.

People with beta-thalassemia have low levels of the oxygen-carrying hemoglobin protein, leading to weakness and exhaustion; those with sickle cell disease make a defective form of the protein, resulting in sickle-shaped red blood cells that block blood vessels and often cause severe pain, organ damage, and strokes.

To treat three sickle cell patients, researchers harvested immature blood cells, known as blood stem cells, from each. They then used CRISPR to disable an “off” switch that—in adults—stops production of the fetal form of hemoglobin, which can counter the effects of the sickling mutation. After the patients received chemotherapy to wipe out their diseased blood stem cells, the CRISPR-treated cells were infused back into their bodies.

The patients, treated up to 17 months ago, are now making plentiful fetal hemoglobin, and have not experienced the painful attacks that used to strike every few months, the companies CRISPR Therapeutics and Vertex Pharmaceuticals reported in December. One patient, a young mother of three, says the treatment changed her life. The companies also gave the treatment to seven patients who normally receive blood transfusions for beta-thalassemia. They haven’t needed transfusions since, the companies reported in the same paper and meeting presentation. With more testing, the new treatment could rival the success of gene therapies that treat the two diseases by adding hemoglobin DNA to stem cells. But like gene therapy, the CRISPR approach requires high-tech medical care and could cost $1 million or more per patient—putting it out of reach for much of Africa, where most people with sickle cell live.


H. Frangoul et al., CRISPR-Cas9 Gene Editing for Sickle Cell Disease and β-Thalassemia, The New England Journal of Medicine, 5 December 2020

J. Kaiser, Tweaking genes with CRISPR or viruses fixes blood disorders, Science, Vol. 370, p. 1254, 11 December 2020

Scientists speak up for diversity


Within days of a racially charged confrontation between a white dog owner and a Black birdwatcher in New York City’s Central Park in late May, scientists flocked to Twitter to celebrate—and support—Black nature enthusiasts. The #BlackBirdersWeek hashtag was soon followed by others, in disciplines from neuroscience to physics, all aiming to create community among Black scientists on Twitter, Zoom, and other platforms. “We’re few and far between, so having us come together as a conglomerate in one virtual space—it really helped,” says Ti’Air Riggins, a biomedical engineering Ph.D. student at Michigan State University who helped organize #BlackInNeuro week.

The social media events took place against the backdrop of the anguished response to police killings in the United States, the Black Lives Matter movement, and discussions within science about the need to create a more equitable, welcoming environment for people of color. Through those discussions, many scientists hoped to reach colleagues who had paid little attention to these issues in the past. “People of color across the board are struggling,” says Tanisha Williams, a botanist at Bucknell University who spearheaded #BlackBotanistsWeek. “It’s a systemic problem.”

Although it’s too early to tell whether the events of this year will spur lasting change, many are hopeful. “This year feels different,” says Shirley Malcom, a senior adviser at AAAS (publisher of Science) who has worked on diversity, equity, and inclusion issues since the 1970s. “All of a sudden, after George Floyd and everything else that came out after that time, you could at least get people’s attention,” she says—adding that many scientists now seem more open to the idea that systemic racism is a problem in their community.

“I definitely feel like our voices are being heard, and in a different way [than before],” Williams says. “But it’s not going to be a quick fix … we have a long road.”


K. Langin, ‘I can’t even enjoy this.’ #BlackBirdersWeek organizer shares her struggles as a black scientist, Science, 5 June 2020

S. Chen, Researchers around the world prepare to #ShutDownSTEM and ‘Strike For Black Lives’, Science, 9 June 2020

G. Barabino, Systemic equity in education, Science, Vol. 369, p. 1277, 11 September 2020

N. Lewis, What I’ve learned about being a Black scientist, Science, 16 July 2020

Global warming forecasts sharpen

Clouds are no longer expected to significantly dampen global warming. ISS EXPEDITION 7 CREW/EOL/NASA

More than 40 years ago, the world’s leading climate scientists gathered in Woods Hole, Massachusetts, to answer a simple question: How hot would Earth get if humans kept emitting greenhouse gases? Their answer, informed by rudimentary climate models, was broad: If atmospheric carbon dioxide (CO2) doubled from preindustrial levels, the planet would eventually warm between 1.5°C and 4.5°C, a climate sensitivity range encompassing the merely troubling and the catastrophic. Now, they’ve finally ruled out the mildest scenarios—and the most dire.

Narrowing those bounds has taken decades of scientific advancement. Understanding how clouds trap or reflect heat has been a particular challenge. Depending on their thickness, location, and composition, clouds can amplify warming—or suppress it. Now, high-resolution cloud models, supported by satellite evidence, have shown that global warming thins low, light-blocking clouds: Hotter air dries them out and subdues the turbulence that drives their formation.

Longer and better temperature records have also helped narrow the range. Studies of Earth’s ancient climate, which estimate paleotemperatures and CO2 levels using ice and ocean sediment cores, suggest how greenhouse gases may have driven previous episodes of warming. And modern global warming has now gone on long enough that surface temperatures, 1.1°C hotter than in preindustrial times, can be used to more confidently project trends into the future.

This year, these advances enabled 25 scientists affiliated with the World Climate Research Programme to narrow climate sensitivity to a range between 2.6°C and 3.9°C. The study rules out some of the worst-case scenarios—but it all but guarantees warming that will flood coastal cities, escalate extreme heat waves, and displace millions of people.

If we’re lucky, such clarity might galvanize action. Atmospheric CO2 is already at 420 parts per million—halfway to the doubling point of 560 ppm. Barring more aggressive action on climate change, humanity could reach that threshold by 2060—and lock in the foreseen warming.


P. Voosen, Earth’s climate destiny finally seen more clearly, Science, Vol. 369, p. 354, 24 July 2020

S. C. Sherwood et al., An Assessment of Earth’s Climate Sensitivity Using Multiple Lines of Evidence, Reviews of Geophysics, 22 July 2020

P. Voosen, New climate models forecast a warming surge, Science, Vol. 364, p. 222, 19 April 2019

Found: elusive source of fast radio bursts

Magnetars are neutron stars with magnetic fields 100 million times stronger than that of any magnet on Earth. © PITRIS/DREAMSTIME.COM

Everyone loves a good mystery. Take fast radio bursts (FRBs)—short, powerful flashes of radio waves from distant galaxies. For 13 years, they tantalized astronomers keen to understand their origins. One running joke said there were more theories explaining what causes FRBs than there were FRBs. (Currently, astronomers know of more than 100.)

Now, cosmic sleuths have fingered a likely culprit: magnetars, neutron stars that fizzle and pop with powerful magnetic fields. Because FRBs are so fast, they must come from a small but intense energy source like a magnetar, which forms when a burned-out star collapses to the size of a city. But although a handful of FRBs had been traced to particular galaxies, no telescope had sharp enough vision to connect them to an individual magnetar at such great distances.

Then, in April, an FRB went off in the Milky Way—close enough that astronomers could examine the scene. The Canadian Hydrogen Intensity Mapping Experiment, a pioneering survey telescope in British Columbia responsible for the discovery of many FRBs, narrowed the source to a small area of sky, which was soon confirmed by the U.S. radio array STARE2. Orbiting observatories sensitive to higher frequencies quickly found that a known magnetar in that part of the sky, called SGR 1935+2154, was acting up at the same time, spewing out bursts of x-rays and gamma rays.

Although astronomers studying FRBs believe they have finally found their perpetrator, they still don’t know exactly how magnetars produce the radio bursts. They could come from close to the magnetar’s surface, as magnetic field lines break and reconnect—similar to the Sun’s flaring behavior. Or they could come from farther out, as shock waves slam into clouds of charged particles and generate laserlike radio pulses. Stay tuned for a sequel: Crack theorists are on the case.


CHIME/FRB Collaboration, A bright millisecond-duration radio burst from a Galactic magnetar, Nature, Vol. 587, p. 54, 4 November 2020

C. D. Bochenek et al., A fast radio burst associated with a Galactic magnetar, Nature, Vol. 587, p. 59, 4 November 2020

L. Lin et al., No pulsed radio emission during a bursting phase of a Galactic magnetar, Nature, Vol. 587, p. 63, 4 November 2020

D. Clery, Galactic flash points to long-sought source for enigmatic radio bursts, Science, 8 June 2020

D. Clery, Flashes in the scan, Science, Vol. 363, p. 1138, 15 March 2019

World’s oldest hunting scene revealed

A painting on an Indonesian cave wall shows tiny hunters corralling a dwarf buffalo with ropes or spears. RATNO SARDI

More than 40,000 years ago on the Indonesian island of Sulawesi, a prehistoric Pablo Picasso ventured into the depths of a cave and sketched a series of fantastic animal-headed hunters cornering wild hogs and buffaloes. The age of the paintings, pinned down just 1 year ago, makes them the earliest known figurative art made by modern humans.

In 2017, when an Indonesian researcher chanced across the scene, the figures alone told him he had found something special. The animals appear to be Sulawesi warty pigs and dwarf buffaloes, both of which still live on the island. But it was the animallike features of the eight hunters, armed with spears or ropes, that captivated archaeologists. Several of the hunters seem to have long muzzles or snouts. One sports a tail. Another’s mouth resembles a bird beak.

It’s possible the artist was depicting the hunters wearing masks or camouflage, the researchers say, but they may also represent mythical animal-human hybrids. Such hybrids appear in other ancient works of art, including a 35,000-year-old ivory figurine of a lion-man found in a cave in southwestern Germany.

Parts of the paintings were covered in white, bumpy mineral deposits known as cave popcorn. Uranium in this popcorn decays at a fixed rate, which allowed researchers to date minerals on top of the pigment to about 44,000 years ago. The cave scene must be at least that old—about 4000 years older than any other known figurative rock art, they reported in late December 2019. It decisively unseats Europe as the first place where modern humans are known to have created figurative art.
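The arithmetic behind such a date can be sketched with a deliberately simplified model (our illustration, not the authors’ full method, which also corrects for initial uranium-isotope disequilibrium and detrital thorium): fresh calcite contains uranium but essentially no thorium, so the thorium-230/uranium-234 activity ratio grows from zero at a rate set by thorium-230’s half-life of roughly 75,600 years.

```python
import math

# Simplified U-series clock (illustration only): assume the calcite
# started with no thorium, so the 230Th/234U activity ratio grows
# from 0 toward 1 as 230Th builds up from uranium decay.
TH230_HALF_LIFE_YR = 75_584          # approximate half-life of 230Th
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YR

def ratio_from_age(years):
    """Expected 230Th/234U activity ratio after `years` of ingrowth."""
    return 1.0 - math.exp(-LAMBDA_230 * years)

def age_from_ratio(ratio):
    """Invert the clock: years elapsed for a measured ratio in (0, 1)."""
    return -math.log(1.0 - ratio) / LAMBDA_230

# A crust roughly 44,000 years old would show about a third of the
# eventual thorium ingrowth:
print(round(ratio_from_age(44_000), 2))   # 0.33
```

Because the popcorn grew on top of the pigment, any age computed this way is a minimum for the painting beneath it.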

If the figures do depict mythical human-animal hunters, their creators may have already passed an important cognitive milestone: the ability to imagine beings that do not exist. That, the researchers say, forms the roots of most modern—and ancient—religions.


N. Conard, Palaeolithic ivory sculptures from southwestern Germany and the origins of figurative art, Nature, Vol. 426, p. 830, 18 December 2003

M. Aubert et al., The Timing and Nature of Human Colonization of Southeast Asia in the Late Pleistocene: A Rock Art Perspective, Current Anthropology, Vol. 58, p. S553, 15 November 2017

D. L. Hoffmann et al., U-Th dating of carbonate crusts reveals Neandertal origin of Iberian cave art, Science, Vol. 359, p. 912, 23 February 2018

M. Price, World’s oldest hunting scene shows half-human, half-animal figures—and a sophisticated imagination, Science, 11 December 2019

AI disentangles protein folding

Structures of a protein that were predicted by artificial intelligence (blue) and experimentally determined (green) match almost perfectly. DEEPMIND

For 5 decades, scientists have struggled to solve one of biology’s biggest challenges: predicting the precise 3D shape a string of amino acids will fold into as it becomes a working protein. This year, they achieved that goal, developing an artificial intelligence (AI) program that predicts most protein structures as accurately as laboratory experiments can map them. Because a protein’s precise shape determines its biochemical functions, the new program could help researchers uncover mechanisms of disease, develop new drugs, and even create drought-tolerant plants and cheaper biofuels.

Researchers traditionally decipher structures using laborious techniques such as x-ray crystallography and cryo–electron microscopy. But detailed molecular maps only exist for about 170,000 of the 200 million known proteins. Computational biologists have dreamed of simply predicting a protein’s structure by modeling the amino acid interactions that govern its 3D shape. But because amino acids can interact in so many ways, the number of possible structures for a single protein is astronomical.
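One classic way to make “astronomical” concrete is a Levinthal-style back-of-envelope count (an illustration we add here, not part of the CASP methodology): even if each residue of a modest 100-amino-acid protein could adopt only three conformations, brute-force search would face an impossible number of candidates.

```python
# Levinthal-style estimate (illustrative assumptions: 100 residues,
# 3 conformations each; real proteins have far more options per residue).
residues = 100
conformations_per_residue = 3
candidates = conformations_per_residue ** residues
print(f"{candidates:.2e}")    # on the order of 5e+47 conformations
```

Checking each candidate even at a billion per second would take vastly longer than the age of the universe, which is why prediction programs must learn shortcuts rather than search exhaustively.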

In 1994, structural biologists launched a biennial competition called the Critical Assessment of Protein Structure Prediction (CASP). Entrants are given amino acid sequences for about 100 proteins with as-yet-unknown structures. Some groups try to predict their structures, while others map the same structures in the lab; afterward, their results are compared. Even in CASP’s early years, the predictions for small, simple proteins were on par with experimental observations. But predictions for larger, more challenging proteins lagged far behind.

Not anymore. This year, an AI program created by researchers at U.K.-based DeepMind tallied a median score of 92.4 on a 100-point scale, where anything above 90 is considered as accurate as an experimentally derived structure. On the most challenging proteins, the AlphaFold program averaged 87, 25 points ahead of its closest competitor. And because contest rules require competitors to reveal enough of their methods for others to make use of them, organizers say it’s only a matter of months before other groups match AlphaFold’s success.
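CASP’s 100-point scale is based on the Global Distance Test (GDT), which rewards predictions that place each residue near its experimentally determined position. A minimal sketch of a GDT_TS-style score, assuming already-superimposed structures and using made-up per-residue deviations:

```python
def gdt_ts(distances):
    """GDT_TS-style score: average, over cutoffs of 1, 2, 4, and 8
    angstroms, of the fraction of residues whose predicted position
    falls within that cutoff of the experimental one, scaled to 100."""
    cutoffs = (1.0, 2.0, 4.0, 8.0)
    n = len(distances)
    fractions = [sum(d <= c for d in distances) / n for c in cutoffs]
    return 100 * sum(fractions) / len(cutoffs)

# Ten hypothetical per-residue deviations, in angstroms
deviations = [0.4, 0.7, 1.2, 0.9, 2.5, 0.3, 1.8, 0.6, 3.9, 0.8]
print(round(gdt_ts(deviations), 1))  # a score above 90 would count as
                                     # experimental accuracy in CASP terms
```

The real CASP assessment is more involved (it searches over structural superpositions before measuring distances), so this is only a schematic of how deviations map onto the 100-point scale.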


R. F. Service, ‘The game has changed.’ AI triumphs at protein folding, Science, Vol. 370, p. 1144, 4 December 2020

R. F. Service, Google’s DeepMind aces protein folding, Science, 6 December 2018

How elite controllers keep HIV at bay

HIV (dark blue) inserts itself into host DNA. JANET IWASA/CC BY-NC-SA

HIV, like all retroviruses, has a nasty feature that allows it to dodge attack: It integrates its genetic material into human chromosomes, creating “reservoirs” where it can hide, undetected by the immune system and invulnerable to antiretroviral drugs. But where it hides may make all the difference.

This year, a study of 64 HIV-infected people who have remained healthy for years without antiretroviral drugs revealed a link between their unusual success and where the virus has hunkered down in their genomes. Although the new understanding of these “elite controllers” won’t lead directly to a cure, it opens up a novel strategy that may routinely allow other infected people to live for decades without treatment.

Many studies have examined elite controllers, who make up about 0.5% of the 38 million people living with HIV. But this new work stood apart in size and scope, comparing integrated HIV in the 64 elite controllers with that in 41 HIV-infected people on treatment. HIV does best when it slots itself within genes. When the cell transcribes the genes, the integrated HIV, or “provirus,” can produce new viruses that infect other cells. If it parks in “gene deserts,” portions of chromosomes that rarely transcribe DNA, the provirus sits around like a fully functioning car stuck in a place that doesn’t sell gas.

The study found that in the elite controllers, 45% of functioning proviruses resided in gene deserts, compared with just 17.8% for the people on treatment. Presumably, immune responses in the elite controllers somehow cleared proviruses from the more dangerous parking spots. Now, the challenge is to figure out interventions that will train the immune systems of the vast majority of people living with HIV to behave similarly.

That new insight suggests long-standing, frustrating attempts to cure people by eliminating HIV reservoirs may be too ambitious an approach. Instead, success may depend on shrinking—and then making peace with—these reservoirs, and minding the old real estate dictum of location, location, location.


C. Jiang et al., Distinct viral reservoirs in individuals with spontaneous control of HIV-1, Nature, Vol. 585, p. 261, 26 August 2020

J. Cohen, How ‘elite controllers’ tame HIV without drugs, Science, 26 August 2020

Room temperature superconductivity finally achieved

Crushed between two diamonds, a compound of hydrogen, sulfur, and carbon superconducts at room temperature. ADAM FENSTER

Scientists have spent decades searching for materials that conduct electricity without resistance at room temperature. This year they found the first one, a hydrogen- and carbon-containing compound squeezed to a pressure approaching that at the center of Earth. The discovery is setting off a hunt for room temperature superconductors that work at typical surface pressures; such materials could transform technologies and save the vast amounts of energy wasted when electricity moves through wires.

Superconductivity got its start in 1911, when physicist Heike Kamerlingh Onnes found that a mercury wire chilled to 4.2 K—just 4.2°C above absolute zero—conducted electrons without the usual heat-producing friction. In 1986, researchers found the same was true of a family of copper oxide ceramics. Because these superconductors worked above 77 K—the temperature of liquid nitrogen—they spawned a new generation of MRI machines and particle accelerator magnets. There were hints that copper oxides might superconduct at room temperature, but those claims were never verified.

Confirmation now comes from high-pressure physics, in which scientists smash flecks of material between the flattened tips of two diamonds at pressures millions of times higher than those at Earth’s surface. With such a diamond anvil, researchers in Germany in 2019 compressed a mix of lanthanum and hydrogen to 170 gigapascals (GPa), yielding superconductivity at temperatures up to 250 K (−23°C). This year, researchers in the United States topped that result with a hydrogen, carbon, and sulfur compound compressed to 267 GPa. It conducted without resistance up to 287 K (14°C), the temperature of a chilly room.
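The kelvin figures above translate to everyday temperatures with a simple conversion; the milestone values here are the ones from the text:

```python
def k_to_c(kelvin):
    """Convert a temperature from kelvin to degrees Celsius."""
    return kelvin - 273.15

# Superconducting transition temperatures mentioned in this section
milestones = {
    "mercury (1911)": 4.2,
    "copper oxides / liquid nitrogen": 77.0,
    "lanthanum hydride (2019)": 250.0,
    "C-S-H compound (2020)": 287.0,
}

for name, k in milestones.items():
    print(f"{name}: {k} K = {k_to_c(k):.1f} °C")
```

The last line is the headline result: 287 K is about 14°C, cold for a living room but unmistakably room temperature by a physicist’s standards.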

So far, the new superconductors fall apart when the pressure is released. But the same isn’t true of all high-pressure materials: Diamonds born in the crushing depths of Earth, for example, survive after rising to the surface. Now, researchers hope to find a similarly long-lasting gem for their own field.


R. F. Service, At last, room temperature superconductivity achieved, Science, Vol. 370, p. 273, 16 October 2020

A. P. Drozdov et al., Conventional superconductivity at 203 kelvin at high pressures in the sulfur hydride system, Nature, Vol. 525, p. 73, 17 August 2015

E. Snider et al., Room-temperature superconductivity in a carbonaceous sulfur hydride, Nature, Vol. 586, p. 373, 14 October 2020

Birds are smarter than you think

This homing pigeon may have the necessary neural anatomy for consciousness. RUTH SWAN/ALAMY STOCK PHOTO

Their eyes are beady and their brains are no bigger than a walnut. But two studies published this year suggest birds have startling mental powers. One reveals that part of the avian brain resembles the human neocortex, the source of human intelligence. The other shows that carrion crows are even more aware than researchers had thought—and may be capable of some conscious thought.

In humans, the neocortex consists of horizontal layers laced with interconnected columns of nerve cells, which allow for complex thinking. Bird brains, in contrast, were thought to be arranged in simple clusters of nerve cells. By using a technique called 3D polarized light imaging, neuroanatomists took a closer look at the forebrain of homing pigeons and owls and found that nerves there connect both horizontally—like the layers in the neocortex—and vertically, echoing the columns seen in human brains.

Another team of scientists probed this part of the brains of carrion crows—well-known for their intelligence—for clues that they are aware of what they see and do. The researchers first trained lab-raised crows to turn their heads when they saw certain sequences of lights flashing on a computer monitor. Electrodes in the crows’ brains detected nerve activity between the moment the birds saw the signal and when they moved their heads. The activity developed even when the lights were barely detectable, suggesting it was not simply a response to sensory input, and it was present regardless of whether the birds reacted. The scientists think the neural chatter represents a kind of awareness—a mental representation of what the birds saw.

Such “sensory consciousness” is a rudimentary form of the self-awareness that humans experience. Its presence in both birds and mammals suggests to the researchers that some form of consciousness may date back 320 million years, to our last common ancestor.


A. Nieder et al., A neural correlate of sensory consciousness in a corvid bird, Science, Vol. 369, p. 1626, 25 September 2020

S. Herculano-Houzel, Birds do have a brain cortex—and think, Science, Vol. 369, p. 1567, 25 September 2020

V. Morell, Newfound brain structure explains why some birds are so smart—and maybe even self-aware, Science, 24 September 2020

M. Stacho et al., A cortex-like canonical circuit in the avian forebrain, Science, Vol. 369, 25 September 2020