Antibiotic resistance is rapidly becoming one of the biggest global health threats of the 21st century, but it is a far from modern phenomenon. In fact, as shown by the recent discovery of antibiotic-resistant bacterial strains in an underground cave that has been isolated from the surface for over four million years, resistance predates not only human use of antibiotics, but the human race itself.

When first discovered more than half a century ago, antibiotics were quickly heralded as a miracle of modern medicine. Along with vaccines, they were among the biggest contributors to the fall in deaths from infectious disease over the 20th century. But now, after decades of clinical use, bacteria are evolving resistance.

Through the systematic overuse and misuse of antibiotics, we’ve created the perfect conditions for resistant strains to outcompete their susceptible relatives and spread rapidly. Last week, the World Health Organization highlighted the problem with World Antibiotic Awareness Week, urging people to limit antibiotic use or risk a world in which minor infections once again become deadly. Yet vaccines, another major medical intervention used to control bacterial disease, have not faced the same problem. Why?

A review published by the Royal Society suggests that there are two fundamental differences between antibiotics and vaccines which could explain this. The first is timing: antibiotics are generally taken once a bacterial infection has already occurred and the bacterial population is large enough to cause disease. At this stage, the bacteria have already multiplied many times. Each time the bacteria divide, their DNA is copied, and mistakes in this process can create variation within the population.

This means that by the time a patient takes an antibiotic, the bacterial population is already large and varied enough that a resistant strain is likely to have arisen. That strain then has a clear advantage: the susceptible strains it would normally have to compete with are killed off by the antibiotic, leaving it free to thrive. This logic is backed up by studies showing that the larger a microbial population is at the time of treatment, the more likely drug resistance is to evolve.
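
To see why population size matters so much, consider a back-of-the-envelope calculation (a simple sketch; the mutation rate and population sizes below are illustrative assumptions, not figures from the review). If each cell division independently produces a resistance mutation with probability μ, then after n divisions the chance that at least one resistant cell has arisen is:

```latex
% Probability that at least one resistant mutant exists after n
% cell divisions, assuming each division independently yields a
% resistance mutation with probability \mu (an illustrative model).
P(\text{resistant mutant present}) = 1 - (1 - \mu)^{n} \approx 1 - e^{-\mu n}
% With assumed values \mu = 10^{-9} and n = 10^{10} divisions
% (a large, established infection): P \approx 1 - e^{-10} \approx 0.99995.
% With the same \mu but only n = 10^{6} divisions (an infection
% caught very early): P \approx 10^{-3}.
```

On these assumed numbers, treating a large established infection means a pre-existing resistant mutant is almost certain, while acting before the population has grown, as vaccines do, makes one very unlikely.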

By contrast, vaccines are administered prior to infection. Their role is to prime the immune system to fight any future infections, so that they can be brought under control before the bacteria have had a chance to multiply.

A second difference is the sophistication of how vaccines defend against bacterial infection. Antibiotic drugs tend to act by targeting one specific bacterial protein or mechanism. In some cases, just one mutation can be enough to alter the target so that it is no longer recognised by the drug, making the bacteria resistant. By contrast, some vaccines can expose the immune system to a huge number of bacterial proteins. This promotes the development of a vast repertoire of antibodies, giving the immune system many simultaneous lines of attack against the bacteria. The chances of the bacteria evolving resistance to every type of antibody produced at the same time are slim.
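
The same illustrative arithmetic shows why attacking many targets at once is so hard to evade (again a sketch, under the simplifying assumption that each target is escaped by a single independent mutation occurring with probability μ):

```latex
% Per-division escape probabilities under the simplifying
% assumption of one independent point mutation per target.
P(\text{escape a single drug target}) = \mu
P(\text{escape all } k \text{ antibody targets simultaneously}) = \mu^{k}
% With the assumed \mu = 10^{-9}, a single target can plausibly be
% escaped somewhere in a population of 10^{10} cells, but even
% k = 2 already gives \mu^{2} = 10^{-18}, and a broad antibody
% repertoire pushes the odds far lower still.
```

Real immune escape is messier than independent point mutations, but the multiplication of small probabilities captures the core of the argument.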

Vaccines can help to combat overuse by preventing infections that would otherwise require antibiotic treatment, which in turn helps curb the spread of resistant strains. They also provide a crucial defence against diseases that can no longer be treated with antibiotics.

Drug-resistant infections are already thought to cause more than 700,000 deaths per year. Some estimates suggest that by 2050 this could rise to 10 million deaths a year, more than cancer currently causes. Given that vaccines are far more robust against resistance than antibiotics, further investment in them will play a vital part in our response to this growing global health threat.