Book Review: The Precarious Dance of Humans and Microbes
To beat infectious disease, humans must shed their hubris, suggests Thomas Levenson in “So Very Small.”
- 10 June 2025
- 7 min read
- by Lina Zeldovich

In 1676, Dutch fabric merchant Antonie van Leeuwenhoek held a magnifying glass to a sample of rainwater and saw a slew of tiny creatures squirming around. The first to observe such microscopic beings, he named them animalcules, later finding them in freshwater, seawater, and well water. As he watched them in action, he made detailed drawings and descriptions of their movements. Some, he wrote, looked like “very tiny eels,” some “put forth two little horns continually moving themselves,” while others made “serpent like” movements. Soon, he sent his writings to The Royal Society of London, to share his uncanny discoveries with learned men.
Thus begins Thomas Levenson’s thought-provoking book, “So Very Small: How Humans Discovered the Microcosmos, Defeated Germs — and May Still Lose the War Against Infectious Disease.” A professor of science writing at the Massachusetts Institute of Technology, Levenson walks us through nearly 350 years of scientific inquiry into the microscopic world, chronicling the precarious dance of humans with their mass murderers: microbes. For many centuries, that dance often ended with the microbes winning. But the emergence in the late 19th century of germ theory, which postulates that infectious diseases are caused by microorganisms, finally led humans to create effective modern medicines.

In that narrative, Levenson points out numerous opportunities to identify the disease culprits that scientists and medics botched, missed, or dismissed — despite being aware of the microscopic creatures with which they co-existed. Van Leeuwenhoek’s drawings, for example, were detailed enough that “modern microbiologists can identify the types of bacteria he saw, including the families of organisms now known to cause disease,” Levenson writes. “Yet for most of the next two hundred years, the notion that such creatures might have something to do with human suffering was overlooked, dismissed as speculation, occasionally tentatively entertained — and rejected up to the middle of the nineteenth century. Why was it so hard to see that bacteria play a role in illness?”
It’s arguably hard to pinpoint sources of contagion without modern equipment, especially since different infections spread differently. Measles can be airborne, smallpox spreads by close contact, cholera travels through contaminated water or feces, while malaria and plague are transmitted through the bites of germ-carrying mosquitoes and fleas, respectively. Yet Levenson argues that the biggest obstacles were humans themselves.
The learned men of London took van Leeuwenhoek’s findings seriously, yet they didn’t link the animalcules to the plague that had depopulated the city about a decade earlier, or to any other maladies, because of their Christian worldview. As Levenson writes, the so-called scala naturae, or the Great Chain of Being, had “placed humans not within the natural world, but outside — above — the natural world’s web of connections and interactions.” Implicating the puny organisms in people’s demise was unthinkable. Instead, leading minds embraced miasma theory, which postulated that infections spread through noxious air, and blamed the poor for the squalid living conditions where diseases could fester. “Naming filth as the source of the scourge was not just a medical explanation; it supported, even reinforced, existing ideas about virtue, value, and social order,” Levenson writes.
Physicians once again had the opportunity to link diseases and germs in 1721, when American minister Cotton Mather tried inoculating Bostonians to stop the growing smallpox epidemic. Mather had read a 1714 paper by physician Emanuel Timonius that described inoculation — deliberate exposure to a milder dose of a pathogen — as a grassroots practice followed for centuries in Asia and parts of Africa: First, a practitioner would find a young man successfully battling smallpox. Then, “using a needle, the practitioner would poke a pustule and squeeze out some of the ooze,” Levenson writes. He would then scratch the skin on a healthy person’s arm and mix the ooze in.
Mather corroborated the idea with a statement from his enslaved servant, a man named Onesimus, who had been inoculated that way in Libya, his home country, before his enslavement. “Whoever had the Courage to use it was forever free of the contagion,” Mather quoted Onesimus. His colleagues were vehemently against the practice, and as Mather inoculated courageous Bostonians, someone tossed a grenade through his window, which by sheer miracle didn’t go off. “Anti-vaccine terrorism is no new thing,” Levenson comments.
Even when the evidence stared medics in the face, they didn’t want to see it. That was the case with puerperal, or childbed, fever, which erupted in Europe in the 17th century — when women began to give birth in hospitals rather than at home — killing some three-quarters of the mothers who developed its symptoms. Doctors blamed it on “some fault within their patients,” but the real culprit was streptococcus, passed along by the unsterilized hands of attending physicians and nurses. In the late 1700s, British physician Alexander Gordon realized that the fever struck only those “visited or delivered by a practitioner, or taken care of by a nurse, who had previously attended patients affected with the disease.” Gordon’s published treatise offended the medical community and cost him his practice; he had to leave town.
About a half century later, in the 1840s, Hungarian physician Ignaz Semmelweis, working in Vienna, suffered a similar fate. During an autopsy performed on a woman who had succumbed to puerperal fever, a medical student accidentally nicked a professor with a surgical knife, unwittingly infecting him. When the professor died, the autopsy revealed that his insides were ravaged by the fever, leading Semmelweis to blame the “cadaverous particles introduced into his vascular system.” He ordered everyone moving between the anatomy lab and the hospital wards to wash with a chlorine solution, and the number of maternal deaths plummeted instantly. Attacked by his colleagues, he ended up in an insane asylum where, beaten by the attendants, he died from an infection. Meanwhile, “Europe’s doctors continued to kill Europe’s women,” Levenson writes.
Only in the mid-1800s did Louis Pasteur, observing that microbes could spoil batches of wine, finally pave the way to germ theory. British surgeon Joseph Lister took this further, postulating that bacteria could similarly wreak havoc in the human body. He began treating patients’ wounds and incisions with carbolic acid, a disinfectant, which significantly reduced surgical infections and deaths. Then, in 1875, German physician and microbiologist Robert Koch demonstrated how anthrax bacteria could kill a healthy animal within a day. Pasteur picked up the anthrax torch, and the disease, from that point on, served as “a model for a host of other diseases,” Levenson writes.
Germ theory had finally germinated enough to take hold.
Once the bacterial enemies were clear, humans did a decent job of fighting them — at first. In the 1930s, German bacteriologist Gerhard Domagk formulated the first sulfa drug — a type of antibiotic. The medication saved one of President Franklin D. Roosevelt’s sons from streptococcus, and thousands from World War II battle wound infections and venereal disease. It was “a savior from the battlefield to the brothel,” Levenson writes. In the early 1940s, scientists Howard Florey and Ernst Chain managed to mass-produce penicillin, discovered in 1928 by Scottish physician Alexander Fleming. For the first time in history, “reason and technological mastery could replace God as the guarantors of humankind’s place atop the ranked order of nature,” Levenson writes. For a few decades, it seemed humans were winning the war.
Except we weren’t. Mesmerized by the miracle drugs, humans overused them in agriculture and medicine, leading to antibiotic resistance, a phenomenon Fleming warned against in his 1945 Nobel Prize speech. Humans underestimated how adept microbes are at evolving and at sharing the defensive tricks that let them resist antibiotics. We made the same mistake as before: We thought we were above the natural world.
Levenson does an artful job of weaving little-known historical facts into a suspenseful narrative with unexpected twists and turns. Yet the suspense becomes uncomfortable — and even terrifying — as he paints a not-so-rosy picture of our future. Since germ theory emerged, we’ve arguably been playing catch-up with our microbial defenses rather than staying one step ahead of the bugs. Now we’re losing speed because, as Levenson points out, the antibiotic pipelines are running dry. That’s why “So Very Small” is one of those books that stick with you long after you read it — it makes you realize that our victory in the bacterial war is far from guaranteed.
Although Levenson spends significantly more time on history than on solutions to our precarious situation, he does outline a few. As he writes, developing new antibiotics should be a research priority. Countries should join forces in overseeing antibiotic use: “We should move to manage antibiotics as if they were a global commons — like the atmosphere.” But perhaps most importantly, humans must recognize that “we are utterly enmeshed in natural systems,” he writes, ending on a poignant note: “We live not above the living world, but in it.”