Today, chemo and cancer seem to go together like tea and honey. While not everyone with cancer will need or receive chemotherapy, it’s one of the most commonly used treatments—either on its own or in combination with other therapies to make them more effective.
The irony is, chemotherapy wasn’t originally meant to treat cancer—so how did it become the cancer treatment mainstay it is today?
The 1900s: Chemo Gets Its Start
German chemist Paul Ehrlich coined the name “chemotherapy” (“chemo” as in “chemical”) in the early 1900s, but he wasn’t hunting for a cure for cancer. Instead, he was trying to find drugs that could fight the infectious diseases that were haunting society at the time: pneumonia, flu, tuberculosis, diphtheria, etc.
As Ehrlich defined it, chemotherapy simply meant the use of chemicals to treat disease. Its cancer-fighting potential was only discovered later—and by accident.
The 1940s: Could Chemo Help Fight Cancer?
During World War II, members of the navy were exposed to mustard gas, a chemical warfare agent that causes irritation to the skin, eyes, and respiratory tract. But that’s not all: It was later discovered that these navy personnel also experienced toxic changes to their bone marrow cells.
Shortly afterward, U.S. Army researchers found that the related compound nitrogen mustard actually worked against lymphoma, a cancer of the lymph nodes. It led to remission for many lymphoma patients; however, the remissions weren’t durable, and as a result, the public was initially pessimistic about chemo.
Chemo got another boost in 1947, when Boston pediatrician Sidney Farber demonstrated the effects of the compound aminopterin. It was the first effective treatment against childhood leukemia, a disease that had previously been almost inevitably fatal.
Until this point, cancer was mainly treated with surgery and radiation. Unfortunately, surgery and radiation only worked for cancers that were local—that is, cancers that hadn’t spread beyond the initial tumor site. But by the 1950s, thanks to discoveries from people like Farber, chemo was officially a potential cancer treatment.
The ‘50s Onwards: The Chemo Era
Despite the major limitations of surgery and radiation, the idea of using drugs to fight cancer was considered strange to some, and many patients and even doctors continued to have hostile attitudes toward chemotherapy. Doctors who considered themselves “chemotherapists” were often ridiculed and mistrusted by their colleagues.
But chemo advocates weren’t about to give up. After all, chemo appeared to be better at hunting down cancer cells that had migrated throughout the body—a process known as metastasis.
In the 1970s, chemotherapy researchers delivered the proof they needed. Using chemo, they were able to achieve 80 percent remission rates for patients with Hodgkin’s disease, a type of lymphoma.
By 1973, medical oncology was established as a clinical field, and chemotherapy was considered one of its main weapons. This led to increased research and development, and the practice of chemotherapy became more refined.
Adjuvant chemotherapy was one development during this time period. “Adjuvant” simply refers to a therapy given after the initial or main treatment, so adjuvant chemo is the practice of giving chemotherapy after the main treatment (usually surgery) to eradicate any cancer cells remaining in the body.
Combination chemotherapy was another development. By blending different chemo agents, doctors realized they were better able to treat specific types of cancer.
Later, researchers focused on the side effects. Chemotherapy has long been known as a harsh treatment option, causing an array of side effects. A number of supplemental medications were introduced to reduce them, such as antiemetics for nausea and vomiting.
And today, oncologists have even more targeted chemo drugs at their disposal. These targeted agents are able to treat the cancer while causing fewer side effects overall.