1800s to 1919
Patents & Potions

Introduction
We live today in a world of drugs. Drugs for pain, drugs for disease, drugs for allergies, drugs for pleasure, and drugs for mental health. Drugs that have been rationally designed; drugs that have been synthesized in the factory or purified from nature. Drugs fermented and drugs engineered. Drugs that have been clinically tested. Drugs that, for the most part, actually do what they are supposed to. Effectively. Safely.

By no means was it always so.

Before the end of the 19th century, medicines were concocted with a mixture of empiricism and prayer. Trial and error, inherited lore, or mystical theories were the basis of the world’s pharmacopoeias. The technology of making drugs was crude at best: Tinctures, poultices, soups, and teas were made with water- or alcohol-based extracts of freshly ground or dried herbs or animal products such as bone, fat, or even pearls, and sometimes from minerals best left in the ground—mercury among the favored. The line between a poison and a medicine was hazy at best: In the 16th century, Paracelsus declared that the only difference between a medicine and a poison was in the dose. All medicines were toxic. It was cure or kill.

“Rational treatments” and “rational drug design” of the era were based on either the doctrine of humors (a pseudoastrological form of alchemical medicine oriented to the fluids of the body: blood, phlegm, and black and yellow bile) or the doctrine of signatures. (If a plant looks like a particular body part, it must be designed by nature to influence that part. Lungwort, for example, was considered good for lung complaints by theorists of the time because of its lobe-shaped leaves.) Neither theory, as might be expected, guaranteed much chance of a cure.

Doctors and medicines were popular, despite their failures. As pointed out by noted medical historian Charles E. Rosenberg, a good bedside manner and a dose of something soothing (or even nasty) reassured the patient that something was being done, that the disease was not being ignored.

Blood and mercury
By the first part of the 19th century, the roots of modern pharmacy had taken hold with a wave of heroic medicine. Diseases were identified by symptom, and attacking the symptom as vigorously as possible was the high road to health.

Bloodletting dominated the surgeon’s art, and dosing patients with powerful purgatives and cathartics became the order of the day in an attempt to match the power of the disease with the power of the drug. Bleed them till they faint. (It is difficult to sustain a raging fever or pounding pulse when there is too little blood in the body, so the symptoms, if not what we would call the disease, seemed to vanish.) Dose them with calomel till they drool and vomit. (Animals were thought to naturally expel toxins this way.) Cleanse both stomach and bowels violently to remove the poisons there.

Certainly these methods were neither pleasant nor very effective at curing patients already weakened by disease. George Washington died in misery from bloodletting; Abraham Lincoln suffered chronic mercury poisoning and crippling constipation from his constant doses of “blue mass.” The “cure” was, all too often, worse than the disease.

In the second half of the 19th century, things changed remarkably as the industrial revolution brought technological development to manufacturing and agriculture and inspired the development of medical technology.

Spurred in part by a reaction against doctors and their toxic nostrums, patent medicines and in particular homeopathy (which used extreme dilutions of otherwise toxic compounds) became popular and provided an “antidote” to the heroic treatments of the past. Not helpful, but at least harmless for the most part, these new drugs became the foundation of a commodity-based medicine industry that galvanized pharmacist and consumer alike. Technology entered in the form of pill and powder and potion making.

Almost by accident, a few authentic drugs based on the wisdom and herbal lore of the past were developed: quinine, digitalis, and cocaine. Ultimately, these successes launched the truly modern era. The century ended with the development of two synthesized drugs that represent the triumph of chemistry over folklore and technology over cookery. The development of antipyrine in 1883 and aspirin in 1897 set the stage for the next 10 decades of what we can, in retrospect, call the Pharmaceutical Century. With new knowledge of microbial pathogens and the burgeoning wisdom of vaccine technology, the first tentative steps were taken to place medicines on a truly scientific foundation.

From these scattered seeds, drug technology experienced remarkable if chaotic growth in the first two decades of the 20th century, a period that can be likened to a weedy flowering of quackery and patent medicines twining about a hardening strand of authentic science and institutions to protect and nourish it.

Staging the Pharmaceutical Century
In the latter half of the 19th century, numerous beneficent botanicals took center stage in the world’s pharmacopoeias. Cocaine was first extracted from coca leaves in 1860; salicylic acid—the forerunner of aspirin—was extracted from willow bark in 1874 for use as a painkiller. Quinine and other alkaloids had long been extracted from cinchona bark, but an antifebrile subcomponent, quinoline, was not synthesized in the laboratory until 1883, by Ludwig Knorr. The first truly synthetic pain reliever, antipyrine, was produced from quinoline derivatives. Digitalis from foxglove and strophanthin from an African dogbane were both botanicals purified for use against heart disease. The opium poppy provided a wealth of pain relievers: opium, morphine, codeine, and heroin.

But it was not until the birth of medical microbiology that the true breakthroughs occurred, and science—rather than empiricism—took center stage in the development of pharmaceuticals.

Murderous microbes
The hallmark of 19th-century medicine has to be the microbial theory of disease. The idea that infectious diseases were caused by microscopic living agents provided an understanding of the causes and the potential cures for ills from anthrax to whooping cough.

Technology made the new framework possible. The brilliance of European lens makers and microscopists, coupled with the tinkering of laboratory scientists who developed the technologies of sterilization and the media and methods for growing and staining microbes, provided the foundation of the new medical science that would explode in the 20th century. These technologies offered proof and intelligence concerning the foe against which pharmaceuticals, seen thereafter as weapons of war, could be tested and ultimately designed.

In 1861, the same year that the American Civil War began, Ignaz Semmelweis published his research on the transmissible nature of puerperal (childbed) fever. His theories of antisepsis were at first vilified by doctors who could not believe their unwashed hands could transfer disease from corpses or dying patients to healthy women. But eventually, with the work of Robert Koch, Joseph Lister, and Louis Pasteur adding proof of the existence and disease-causing abilities of microorganisms, a worldwide search for the microbial villains of a host of historically deadly diseases began.

In 1879, as part of the new “technology,” Bacterium coli was discovered (it was renamed Escherichia coli after its discoverer, Theodor Escherich, in 1919). It quickly became the quintessential example of an easily grown, “safe” bacterium for laboratory practice. New growth media, new sterile techniques, and new means of isolating and staining bacteria rapidly developed. The ability to grow “pathogens” in culture proved remarkably useful. Working with pure cultures of the diphtheria bacillus in Pasteur’s laboratory in 1888, Emile Roux and Alexandre Yersin first isolated the deadly toxin that causes most of diphtheria’s lethal effects.

One by one over the next several decades, various diseases revealed their microbial culprits to the so-called microbe-hunters.

Initially, most American physicians were loath to buy into germ theory, seeing it as a European phenomenon incompatible with the “truth” of spontaneous generation and as a threat to the general practitioner from the growing cadre of scientifically trained laboratory microbiologists and specialist physicians.

“Anti-contagionists” such as the flamboyant Colonel George E. Waring Jr., pamphleteer, consulting engineer, and phenomenally effective warrior in the sanitation movement, held sway for a time. Filth was considered the source of disease. A host of sewage projects, street-cleaning regimens, and clean water systems swept urban areas across the United States, with obvious benefits. Ultimately, though, the germ theory of infectious diseases had to be accepted, especially as the theoretical foundation behind the success of the sanitation movement. And with the production of vaccines and antitoxins, older medical frameworks fell by the wayside, though rural American physicians were still recommending bleeding and purgatives as cures well into the first few decades of the 20th century.
SOCIETY: Muckraking and medicine
Media “muckrakers” exposed the seedy underbelly of robber baron capitalism and helped transform American society and government with a wave of reform. They were particularly effective in the areas of medicine and food, especially with the 1905 Collier’s series, “The Great American Fraud,” and Upton Sinclair’s 1906 novel of the Chicago stockyards, The Jungle. The muckrakers revealed patent medicines laced with addictive drugs, toxic additives, and—horror of horrors to teetotaling conservatives—alcohol; and sausages laced with offal, sawdust, and excrement.

These works galvanized the American public (and a disgusted President Teddy Roosevelt) to demand government regulation rather than the prevailing laissez-faire mentality.

Victorious vaccines
The most significant outgrowth of the new germ theory, and the one that created the greatest demand for new technologies for implementation, was the identification and production of the new “immunologicals”—drugs that are, in essence, partially purified components or fractions of animal blood. In 1885, Pasteur developed an attenuated rabies vaccine—a safe source of “active” immunity (immunity developed against a form or component of the disease-causing microorganism by the body’s own immune system). Vaccines would be developed against a variety of microorganisms in rapid succession over the next several decades.

But active immunity was perhaps not the most impressive result of the immunologicals. Antitoxins (antibodies isolated against disease organisms and their toxins from treated animals), when injected into infected individuals, provided salvation from otherwise fatal diseases. This technology began in 1890 when Emil von Behring and Shibasaburo Kitasato isolated the first antibodies against tetanus and, soon after, diphtheria. In 1892, Hoechst Pharma developed a tuberculin antitoxin. These vaccines and antitoxins would form the basis of a new pharmaceutical industry.

Perhaps as important as the development of these new immunologics was the impetus toward standardization and testing that a new generation of scientist-practitioners such as Koch and Pasteur inspired. These scientists’ credibility and success rested upon stringent control—and ultimately, government regulation—of the new medicines. Several major institutions sprang up in Europe and the United States to manufacture and/or inspect in bulk the high volume of vaccines and antitoxins demanded by a desperate public suddenly promised new hope against lethal diseases. These early controls helped provide a bulwark against contamination and abuse. Such control would not be available to the new synthetics soon to dominate the scene with the dawn of “scientific” chemotherapy.

Medicinal chemistry
Parallel (and eventually linked) to developments in biology, the chemist’s art precipitously entered the medicinal arena in 1856 when Englishman William Perkin, in an abortive attempt to synthesize quinine, stumbled upon mauve, the first synthesized coal tar dye. This discovery led to the development of many synthetic dyes but also to the realization that some of these dyes had therapeutic effects. Synthetic dyes, and especially their medicinal “side effects,” helped put Germany and Switzerland in the forefront of both organic chemistry and synthesized drugs. The dye–drug connection was a two-way street: The antifever drug Antifebrin, for example, was derived from aniline dye in 1886.

The chemical technology of organic synthesis and analysis seemed to offer for the first time the potential to scientifically ground the healer’s art in a way far different from the “cookery” of ancient practitioners. In 1887, phenacetin, a pain reliever, was developed by Bayer specifically from synthetic drug discovery research. The drug eventually fell into disfavor because of its side effect of kidney damage. Ten years later, also at Bayer, Felix Hoffman synthesized acetylsalicylic acid (aspirin). First marketed in 1899, aspirin has remained the most widely used of all the synthetics.

Many other new technologies also enhanced the possibilities for drug development and delivery. The advent of the clinical thermometer in 1870 spearheaded standardized testing and the development of the antifever drugs. In 1872, Wyeth invented the rotary tablet press, which was critical to the mass marketing of drugs. By 1883, a factory was producing the first commercial drug (antipyrine) in a ready-dosed, prepackaged form. With the discovery of X-rays in 1895, the first step was taken toward X-ray crystallography, which would become the ultimate arbiter of complex molecular structure, including proteins and DNA.

The Pharmaceutical Century
Not only did the early 1900s bring the triumph of aspirin as an inexpensive and universal pain reliever—the first of its kind—but the science of medicine exploded with a new understanding of the human body and its systems. Although not immediately translated into drugs, these discoveries would rapidly lead to a host of new pharmaceuticals and a new appreciation of nutrition as a biochemical process and hence a potential source of drugs and drug intervention.

Of equal if not greater importance to the adoption and implementation of the new technologies was the rise of public indignation—a demand for safety in food and medicines that began in Europe and rapidly spread to the United States.

Tainted food and public furor
The developing understanding of germ theory and the increasing availability of immunologics and chemical nostrums forced recognition that sanitation and standardization were necessary for public health and safety. First in Europe, and then in the United States, the new technologies led to the growth of new public and semipublic institutions dedicated to producing and/or assaying the effectiveness and safety of pharmaceuticals and foods in addition to those dedicated to sanitation and public disease control. Unfortunately, the prevalence of disease among the poor created a new line of prejudice against these presumed “unsanitary” subclasses.

In the United States, where the popular sanitation movement could now be grounded in germ theory, this fear of contagion, manifested among the developing middle classes, was directed especially against immigrants—called “human garbage” by pundits such as American social critic Henry George in 1883. This led to the Immigration Act of 1891, which mandated physical inspection of immigrants for diseases of mind and body—any number of which could be considered cause for quarantine or exclusion. Also in 1891, the Hygienic Laboratory (founded in 1887 and the forerunner of the National Institutes of Health) moved from Staten Island (New York City) to Washington, DC—a sign of its growing importance.

That same year, the first International Sanitary Convention was established. Although restricted to efforts to control and prevent cholera, it would provide a model of things to come in the public health arena. In 1902, an International Sanitary Bureau (later renamed the Pan American Sanitary Bureau and then the Pan American Sanitary Organization) was established in Washington, DC, and became the forerunner of today’s Pan American Health Organization, which also serves as the World Health Organization’s Regional Office for the Americas.

Fears of contagion on the one hand and poisoning on the other, resulting from improperly prepared or stored medicines, led to the 1902 Biologics Control Act, which regulated the interstate sale of viruses, serums, antitoxins, and similar products.

One of the significant outgrowths of the new “progressive” approach to solving public health problems with technological expertise and government intervention was the popularity and influence of a new class of journalists known as the muckrakers. Under their impetus, and as the result of numerous health scandals, the 1906 U.S. Pure Food and Drugs Act, planned for years by U.S. Department of Agriculture (USDA) researchers such as Harvey Wiley, passed easily. The act established the USDA’s Bureau of Chemistry as the regulatory agency. Unfortunately, the act gave the federal government only limited powers of inspection and control over the industry. Many patent medicines survived this first round of regulation.

The American Medical Association (AMA) created a Council on Pharmacy and Chemistry to examine the issue and then established a chemistry laboratory to lead the attack on the trade in patent medicines that the Pure Food and Drugs Act had failed to curb. The AMA also published New and Nonofficial Remedies annually in an effort to control drugs by highlighting serious issues of safety and inefficacy. This publication prompted rapid changes in industry standards.
TECHNOLOGY: Enter the gene
The Pharmaceutical Century ended on a wave of genetic breakthroughs—from the Human Genome Project to isolated genes for cancer. And so, too, did it begin, though relatively ineffectively at first, at least with regard to the development of medicines.

In 1908, A. E. Garrod described “inborn errors of metabolism” based on his analysis of family medical histories—a major breakthrough in human genetics and the first recognized role of biochemistry in heredity. In 1909, Wilhelm Johannsen coined the terms “gene,” “genotype,” and “phenotype.” In 1915, bacteriophages were discovered. First thought to be another “magic bullet,” their failure as routine therapeutics became secondary to their use in the study of bacterial genetics. By 1917, Richard Goldschmidt suggested that genes are enzymes and, by doing so, fully embraced the biochemical vision of life.

International health procedures also continued to be formalized—L’Office International d’Hygiène Publique (OIHP) was established in Paris in 1907, with a permanent secretariat and a permanent committee of senior public health officials. Military and geopolitical concerns would also dominate world health issues. In 1906, the Yellow Fever Commission was established in Panama to help with U.S. efforts to build the canal; in 1909, the U.S. Army began mass vaccination against typhoid.

Nongovernmental organizations also rallied to the cause of medical progress and reform. In 1904, for example, the U.S. National Tuberculosis Society was founded (based on earlier European models) to promote research and social change. It was one of many groups that throughout the 20th century were responsible for much of the demand for new medical technologies to treat individual diseases. Grassroots movements such as these flourished. Public support was often behind the causes. In 1907, Red Cross volunteer Emily Bissell designed the first U.S. Christmas Seals (the idea started in Denmark). The successful campaign provided income to the Tuberculosis Society and a reminder to the general public of the importance of medical care. Increased public awareness of diseases and new technologies such as vaccination, antitoxins, and later, “magic bullets,” enhanced a general public hunger for new cures.

The movement into medicine of government and semipublic organizations such as the AMA and the Tuberculosis Society throughout the latter half of the 19th and beginning of the 20th centuries set the stage for a new kind of medicine that was regulated, tested, and “public.” Combined with developments in technology and analysis that made regulation possible, public scrutiny slowly forced medicine to come out from behind the veil of secret nostrums and alchemical mysteries.

The crowning of chemistry
It was not easy for organized science, especially chemistry, to take hold in the pharmaceutical realm. Breakthroughs in organic synthesis and analysis had to be matched with developments in biochemistry, enzymology, and general biology. Finally, new medicines could be tested for efficacy in a controlled fashion using new technologies—laboratory animals, bacterial cultures, chemical analysis, clinical thermometers, and clinical trials, to name a few. Old medicines could be debunked using the same methods—with public and nongovernmental organizations such as the AMA providing impetus. At long last, the scientific community began to break through the fog of invalid information and medical chicanery to attempt to create a new pharmacy based on chemistry, not caprice.

The flowering of biochemistry in the early part of the new century was key, especially as it related to human nutrition, anatomy, and disease. Some critical breakthroughs in metabolic medicine had been made in the 1890s, but they were exceptions rather than regular occurrences. In 1891, myxedema was treated with sheep thyroid injections. This was the first proof that animal gland solutions could benefit humans. In 1896, Addison’s disease was treated with chopped-up adrenal glands from a pig. These test treatments provided the starting point for all hormone research. Also in 1891, a pair of agricultural scientists developed the Atwater–Rosa calorimeter for large animals. Ultimately, it provided critical baselines for human and animal nutrition studies.

But it wasn’t until the turn of the century that metabolic and nutritional studies truly took off. In 1900, Karl Landsteiner discovered the first human blood groups: O, A, and B. That same year, Frederick Hopkins discovered tryptophan and demonstrated in rat experiments that it was an “essential” amino acid—the first discovered. In 1901, fats were artificially hydrogenated for storage for the first time (providing a future century of heart disease risk). Eugene L. Opie discovered the relationship of islets of Langerhans to diabetes mellitus, thus providing the necessary prelude to the discovery of insulin. Japanese chemist Jokichi Takamine isolated pure epinephrine (adrenaline). And E. Wildiers discovered “a new substance indispensable for the development of yeast.” Growth substances such as this eventually became known as vitamines and later, vitamins.

In 1902, proteins were first shown to be polypeptides, and the AB blood group was discovered. In 1904, the first organic coenzyme—cozymase—was discovered. In 1905, allergies were first described as a reaction to foreign proteins by Clemens von Pirquet, and the word “hormone” was coined. In 1906, Mikhail Tswett developed the all-important technique of column chromatography. In 1907, Ross Harrison developed the first animal cell culture using frog embryo tissues. In 1908, the first biological autoradiograph was made—of a frog. In 1909, Harvey Cushing demonstrated the link between pituitary hormone and gigantism.
BIOGRAPHY: Paul Ehrlich
Ehrlich’s breakthrough research originally began from his study of coal tar dyes. Their properties of differentially staining biological material led him to question the relationship of chemical structures to patterns of distribution and affinities for living cells. He expanded this theoretical framework (his side-chain theory of cell function) to include immunology and chemotherapy. Ehrlich believed strongly in the necessity of in vivo testing. Using the arsenical compound atoxyl, which British researchers had discovered was effective against trypanosomes (but which also damaged the optic nerve of the patient), Ehrlich modified the chemical side chains in an attempt to preserve its therapeutic effect while eliminating its toxicity. This “rational” approach led to compound 606 in 1909. Trade-named Salvarsan, it was the first “magic bullet.”

Almost immediately after Svante August Arrhenius and Søren Sørensen demonstrated in 1909 that pH could be measured, Sørensen pointed out that pH can affect enzymes. This discovery was a critical step in the development of a biochemical model of metabolism and kinetics. So many breakthroughs of medical significance occurred in organic chemistry and biochemistry in the first decade of the Pharmaceutical Century that no list can do more than scratch the surface.

Making magic bullets
It was not the nascent field of genetics, but rather a maturing chemistry that would launch the most significant early triumph of the Pharmaceutical Century. Paul Ehrlich first came up with the magic bullet concept in 1906. (Significant to the first magic bullet’s ultimate use, in this same year, August von Wassermann developed his syphilis test only a year after the bacterial cause was determined.) However, it wasn’t until 1910 that Ehrlich’s arsenic compound 606, marketed by Hoechst as Salvarsan, became the first effective treatment for syphilis. It was the birth of chemotherapy.

With the cure identified and the public increasingly aware of the subject, it was not surprising that the “progressive” U.S. government intervened in the public health issue of venereal disease. The Chamberlain–Kahn Act of 1918 provided the first federal funding specifically designated for controlling venereal disease. It should also not be a surprise that this attack on venereal disease came in the midst of a major war. Similar campaigns would be remounted in the 1940s.

The “fall” of chemotherapy
Salvarsan provided both the promise and the peril of chemotherapy. The arsenicals, unlike the immunologicals, were not rigidly controlled and were far more subject to misprescription and misuse. (They had to be administered in an era when injection meant opening a vein and percolating the solution into the bloodstream through glass or rubber tubes.) The problems were almost insurmountable, especially for rural practitioners. The toxicity of these therapeutics and the dangers associated with using them became their downfall. Most clinicians of the time thought the future was in immunotherapy rather than chemotherapy, and it wasn’t until the antibiotic revolution of the 1940s that the balance would shift.

Ultimately, despite the manifold breakthroughs in biochemistry and medicine, the end of the ’Teens was not a particularly good time for medicine. The influenza pandemic of 1918–1920 clearly demonstrated the inability of medical science to stand up against disease. More than 20 million people worldwide were killed by a flu that attacked not the old and frail but the young and strong. This was a disease that no magic bullet could cure and no government could stamp out. Both war and pestilence set the stage for the Roaring Twenties, when many people were inclined to “eat, drink, and make merry” as if to celebrate the optimism of a world ostensibly at peace. 
Suggested reading
  • Bordley, J., III; Harvey, A. M. Two Centuries of American Medicine: 1776–1976 (W. B. Saunders Co.: Philadelphia, 1976)
  • Facklam, H.; Facklam, M. Healing Drugs: The History of Pharmacology (Facts on File, Inc.: New York, 1992)
  • Gevitz, N., Ed. Other Healers: Unorthodox Medicine in America (Johns Hopkins University Press: Baltimore, 1988)
  • Porter, R. The Greatest Benefit to Mankind: A Medical History of Humanity (W. W. Norton & Co.: New York, 1997)
  • Weatherall, M. In Search of a Cure: A History of Pharmaceutical Discovery (Oxford University Press: New York, 1990)

Still, a burgeoning science of medicine promised a world of wonders yet to come. Technological optimism and industrial expansion provided an antidote to the malaise caused by failed promises revealed in the first two decades of the new century.

But even these promises were suspect as the Progressive Era drew to a close. In what would become a familiar refrain, monopoly capitalism and renewed conservatism battled against government intervention in health care as much as in the economy. The continued explosive growth of cities undid many of the earlier gains in sanitation and hygiene with a host of new “imported” diseases. The chronically poor health and nutrition of both the urban and rural poor around the world grew worse with the economic fallout of the war.

Many people were convinced that things would only get worse before they got better.

The Pharmaceutical Century had barely begun.

© 2000 American Chemical Society