1: Decisions
The source of human power is neither muscle nor mind but models
Some threats are sudden and unexpected. Others are slow and smoldering. Both represent cognitive blind spots for which societies are unprepared. Whether pandemics or populism, new weapons or new technologies, global warming or gaping inequalities, how humans respond marks the difference between survival and extinction. And how we act depends on what we see.
Each year, more than 700,000 people around the world die from infections that antibiotics once cured but no longer do. The bacteria have developed resistance. The number of deaths is rising fast. Unless a solution is found, it is on track to hit ten million a year, or one person every three seconds. It makes even the tragedy of Covid-19 pale by comparison. And it is a problem that society itself has produced. Antibiotics work less and less well due to overuse: the very drugs that once subdued the bacteria have turned them into superbugs.
We take antibiotics for granted, but before penicillin was discovered in 1928 and mass-produced more than a decade later, people routinely died from broken bones or simple scratches. In 1924, the sixteen-year-old son of American president Calvin Coolidge got a blister on his toe while playing tennis on the White House lawn. It became infected, and he died within the week—neither his status nor wealth could save him. Today, almost every aspect of medicine, from a C-section to cosmetic surgery to chemotherapy, relies on antibiotics. If their power were to wane, those treatments would become far riskier.
From her colorful, plant-strewn office in Cambridge, Massachusetts, Regina Barzilay, a professor of artificial intelligence at MIT, envisioned a solution. Conventional drug development mostly focuses on finding substances with molecular "fingerprints" similar to ones that work. That generally performs well, but not for antibiotics. Most substances with similar compositions have already been examined, and new antibiotics are so close in structure to existing ones that bacteria quickly develop resistance to them, too. So Barzilay and a diverse team of biologists and computer scientists, led by Jim Collins, a professor of bioengineering at MIT, embraced an alternative approach. What if, instead of looking for structural similarities, they focused on the effect: Did a substance kill bacteria? They reconceived the problem not as a biological one but as an informational one.
Charismatic and confident, Barzilay doesn't come across as a typical nerd. But then, she is accustomed to defying categories. She grew up under communism in what is now Moldova, speaking Russian; was educated in Israel, speaking Hebrew; and attended grad school in America. In 2014, as a new mother in her early forties, she was diagnosed with breast cancer, which she survived after difficult treatments. This ordeal led her to change her research in order to focus on artificial intelligence in medicine. As her research gained attention, a MacArthur "genius grant" followed.
Barzilay and the team got to work. They trained an algorithm on more than 2,300 compounds with known antimicrobial properties, teaching it to recognize which ones inhibited the growth of E. coli, a noxious bacterium. Then the model was applied to around six thousand molecules in the Drug Repurposing Hub and later to more than one hundred million molecules in another database to predict which might work. In early 2020 they struck gold. One molecule stood out. They named it "halicin" after HAL, the renegade computer in 2001: A Space Odyssey.
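To make that workflow concrete, here is a minimal sketch in Python of the "train on the effect, then screen a library" pattern. It is illustrative only: the molecules and labels below are placeholders, the rdkit and scikit-learn packages are assumed to be installed, and a simple random-forest classifier over molecular fingerprints stands in for the team's actual deep-learning model.

```python
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles: str) -> np.ndarray:
    # Encode a molecule as a 2048-bit Morgan fingerprint.
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    arr = np.zeros((2048,), dtype=np.int8)
    DataStructs.ConvertToNumpyArray(fp, arr)
    return arr

# Placeholder data: in the real project, more than 2,300 compounds were
# labeled by their effect on E. coli, and millions more were screened.
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)O", "CCN"]
train_labels = [0, 1, 0, 1]  # 1 = inhibited growth, 0 = did not

X_train = np.vstack([featurize(s) for s in train_smiles])
model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, train_labels)

# Score an unlabeled library by predicted probability of an antibacterial
# effect, then rank; top candidates go back to the lab for testing.
library_smiles = ["CC(=O)Oc1ccccc1C(=O)O", "OCC(O)CO"]
X_library = np.vstack([featurize(s) for s in library_smiles])
scores = model.predict_proba(X_library)[:, 1]
ranked = sorted(zip(library_smiles, scores), key=lambda pair: -pair[1])
print(ranked)
```

The key point the sketch captures is the framing: the model is never asked whether a molecule looks like a known antibiotic, only whether it is predicted to produce the desired effect.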
The discovery of a superdrug to kill superbugs made headlines around the world. It was hailed as a "video killed the radio star" moment for the superiority of machine over man. "AI Discovers Antibiotics to Treat Drug-Resistant Diseases," boomed a front-page headline in the Financial Times.
But that missed the real story. It wasn't a victory for artificial intelligence but a success of human cognition: the ability to rise to a critical challenge by conceiving of it in a certain way and altering aspects of it, opening up new paths to a solution. Credit does not go to a new technology but to a human ability.
"Humans were the ones who selected the right compounds, who knew what they were doing when they gave the material for the model to learn from," Barzilay explains. People defined the problem, designed the approach, chose the molecules to train the algorithm, and then selected the database of substances to examine. And once some candidates popped up, humans reapplied their biological lens to understand why it worked.
The process of finding halicin is more than an outstanding scientific breakthrough or a major step toward accelerating and lowering the cost of drug development. To succeed, Barzilay and the team needed to harness a form of cognitive freedom. They didn't get the idea from a book, from tradition, or by connecting obvious dots. They got it by embracing a unique cognitive power that all people possess.