
A Brief History of Vaccination

The practice of immunization can be traced back over 1000 years.

January 27, 2021

By Sabby Kainth

It sounds simple: inject yourself with a trace amount of a bad thing so that it can’t kill you in the future. But where did this wild idea of willingly putting a deadly pathogen into your body first originate?

Immunization has been a shining light, a true beacon of the age of information and science. Countless lives have been saved and pathogens halted in their tracks before they could get the opportunity to unleash ruin. Essentially, a vaccine tricks the immune system into producing antibodies against a harmless form of a pathogen, priming it to fight the real, deadly one. To put it plainly, the vaccine represents the single most noteworthy achievement of biomedicine: preventive immunity.

Our first accounts of inoculation have primitive roots

The practice of immunization can be traced back over a thousand years. Buddhist monks were known to drink snake venom to gain immunity against deadly bites. Inoculation, a word derived from the Latin root meaning “to graft,” was the precursor to modern vaccination. It referred to the practice of making a small cut in the skin and inserting a weak strain of a virulent pathogen in the hope of gaining immunity. Most of the time it was just that: a hope. Inoculation came with many risks, such as the transmission of other deadly bloodborne diseases like syphilis. Forms of inoculation were practiced in Africa, India and China before the fifteenth century.

Another way the Chinese may have practiced inoculation was by scratching matter from a smallpox sore into a healthy person’s arm. | The Historical Medical Library of The College of Physicians of Philadelphia

The oldest documented form of “immunization” can be seen in Chinese texts. Legend has it that a doctor on Mount Emei came up with the technique around the eleventh century and was welcomed to the capital, where he successfully inoculated the child of the Grand Councilor. While the legend may be fabricated, we do know that by the fifteenth century the prevention of smallpox through a process called variolation was being widely practiced in China.

The idea of introducing trace amounts of a virus into a healthy subject to immunize them against a disease developed slowly, refined over centuries. One of the first steps towards a breakthrough lay in this practice of variolation: the idea that one could introduce a small amount of infectious material into a healthy body and, in effect, become immune to the disease.

The most common method was “nasal insufflation”, a process that involved blowing powdered smallpox scabs up the nostril. A mild case of the infection was selected, and 3–4 scabs from the infected individual were ground into a fine powder and bound in cotton. The material was then placed in a pipe and puffed up the nose. The least disgusting strategy was to wear the garments of an infected individual. Whichever strategy was used, the recipient would soon contract a mild form of smallpox. This person was likely to recover and eventually develop resistance to the disease. While variolation was effective, people still died, particularly if the recipients developed acute cases.

A tipping point in medicine comes from a unique observation about milkmaids

While variolation had become common practice in Asia, it was still largely regarded as folklore in Europe. Dr. Emmanuel Timoni’s description of the practice would become the first detailed account of variolation in Europe, and in 1768 an English doctor by the name of John Fewster observed that farmers who had previously been infected with cowpox were immune to smallpox. But it would appear that the history of modern vaccination officially started with Edward Jenner and his smallpox vaccine in 1796.

The devastation brought on by smallpox relentlessly tore through the sixteenth century, killing millions across the globe. Epidemics of whooping cough, measles, yellow fever and other lethal infectious diseases emphasized the need to tackle contagious illnesses. Virology remained an imprecise science, and understanding the mechanisms of transmission would be integral to creating the first vaccine. The story of immunization did not belong solely to smallpox, and Jenner was certainly not the first to dip his toes into the pool of preventive immunity. Inoculation was not foreign, but the hesitancy among doctors lay in the fact that inoculated individuals could still be carriers of a disease.

There has been some dispute as to who first postulated a correlation between cowpox infection and smallpox immunity, but the story goes that Jenner had observed a unique phenomenon among milkmaids. Noting that the vast majority of them were immune to smallpox, he formed an early hypothesis: contracting cowpox protected the milkmaids from developing smallpox. He further proposed that the pus from the blisters they had acquired from cowpox could be used to “vaccinate” against smallpox. This was revolutionary because cowpox was much less virulent than smallpox, with a significantly higher survival rate.

Edward Jenner | https://wellcomecollection.org/works/jpsdbu7d

Jenner was ready to test an idea that would become a building block of modern virology. In 1796, he inoculated an early subject, James, the son of his gardener. Jenner took pus scraped from the hands of Sarah Nelmes, a milkmaid who had previously contracted cowpox, and inoculated both of James’s arms with the virus. The gardener’s son developed a fever and a general feeling of being unwell, but his symptoms remained mild. James was now immune to smallpox. From vacca, the Latin word for cow, Jenner called his method “vaccination”. Jenner’s work represented the first controlled scientific attempt to eradicate an infectious illness through preventive immunity. He was not the first to discover the method, but he was the first to pursue scientific accreditation. The idea of vaccination was generally well received in Europe at the time despite some initial opposition. After all, the method’s results could be observed, and it was far safer than contracting the deadly smallpox.

Experiments with animal tissue further the cause for vaccination

It was Louis Pasteur and his colleagues who outlined the clearest idea of attenuation, the practice of generating a less lethal strain of a pathogen to be used in immunization. Their idea was first demonstrated with Pasteurella multocida, the bacterium responsible for chicken cholera, a diarrheal disease of chickens. The method Pasteur and his colleagues followed was to expose cultures to heat and oxygen, weakening the pathogen and rendering it far less harmful, a development that later played a large role in creating the rabies vaccine. The most notable cultivation of a pathogen in a lab for the purpose of attenuation was by Calmette and Guérin. In their study, they passed the tuberculosis bacterium hundreds of times through an artificially created medium to create an attenuated strain that could provide immunity against tuberculosis.

In the twentieth century, the virus responsible for yellow fever was attenuated by passing it through the tissues of chicken and mouse embryos. We now knew that passing viruses through the tissues of non-human hosts provided attenuation; vaccines for polio and rabies were developed by passing virulent strains through the tissues of other animals. A further breakthrough came with the knowledge that cells could be grown in vitro to sustain the growth of viruses. A scientist could now grow the polio virus in a cell culture in their lab, a method that would radically aid vaccine development. This wasn’t the only benefit: growing viruses in culture media led them to mutate, and these mutants often lost much of their ability to cause disease in human hosts.

By the end of the 19th century, scientists knew that not all bacteria had to be alive to trigger a response by the body that led to immunity. Vaccines could be “inactivated”: the bacteria were killed with heat or chemicals, and the dead cells could be introduced as the inoculation. This process was safer for the host. The first inactivated vaccines were created in the United States and France to combat pathogens such as typhoid and cholera. This launched an arms race between many European countries as each scrambled to develop antibacterial vaccines. Populations were immunized against the plague with inactivated strains of the plague bacillus. In 1923, Glenny and Hopkins showed that it was possible to create a less toxic diphtheria vaccine by inactivating the diphtheria toxin, while still allowing the immune system to generate toxin-destroying antibodies. As we entered the 20th century, more bacteria were used in their inactivated state for immunization. The influenza vaccine became the first inactivated viral vaccine, and inactivation then became integral to generating vaccines for diseases such as polio and hepatitis A.

Further advances in the 20th century prompted a wave of vaccines that protected us against whooping cough (1914), diphtheria (1926), tetanus (1938), influenza (1945) and mumps (1948). Thanks to new manufacturing methods, vaccine production could be scaled up by the late 1940s, setting the stage for worldwide vaccination campaigns.

Vaccines against polio (1955), measles (1963), rubella (1969) and other infections were added to the list, and overall immunization rates shot up significantly thanks to successful worldwide campaigns. By 1980 we were completely free of smallpox, the first of many victories over viral disease. There was still a long way to go to complete our understanding of infectious diseases.

Lining up to receive the oral polio vaccine in 1962 | CDC/Mr. Stafford Smith

The end of the 20th century marked the dawn of the age of genetic engineering. The only superstar to break out from this method was the vaccine for hepatitis B. Genetically engineered vaccines are more costly to manufacture than traditional vaccines, primarily because of the standard to which the antigen must be purified. Subunit vaccines are made from components of a pathogen rather than the whole organism, which is why they are referred to as “engineered”. A prime example of this technique is the hepatitis B vaccine. Researchers noticed that a surface protein of the hepatitis B virus (HBV) induced protective immunity. Making the vaccine consisted of extracting the gene for that protein and inserting it into the genome of yeast or mammalian cells; the protein those cells produce can then be harvested and purified into the vaccine. Subunit vaccines have an inherent draw in that they are very safe, since they contain no viral genome.

Where we stand today

Then came the mRNA vaccines. One disadvantage of mRNA vaccines is that they can break down at higher temperatures, which is why the current vaccines for COVID-19 are stored at such cold temperatures. One approach to making mRNA vaccines more stable is to add stabilizers and remove water from the vaccine through a process called lyophilization, which has been shown to allow some mRNA vaccines to be stored in a refrigerator rather than a freezer.

Vaccination against long-standing diseases will continue to be important in the decades and centuries ahead, but the work isn’t finished. To stop the spread of deadly illnesses, we need a way to monitor new viruses and quickly create vaccines against the most dangerous emerging infections. This past year was a reminder of how poorly prepared the world was to deal with a pandemic.

We’ve come a long way since the unsafe and frightening early immunization attempts of five centuries ago. Groundbreaking research, sweeping worldwide efforts, and new public-private partnerships are, quite literally, lifesavers. Finding a vaccine for COVID-19 has been a huge test. If the past has taught us one thing, it’s that our greatest discoveries are made in our most challenging moments.
