Introduction

In the 1990s, the Human Genome Project took off like a rocket, along with many global genomic initiatives aimed at plants, animals, and microorganisms. Gene therapy developed, as did the potential of human cloning and the use of human tissues as medicine. The new technologies provided hope for solving the failures of the old through a new paradigm—one that was more complex, holistic, and individualized. The changing view became one of medicines and therapies based on knowledge, not trial and error; on human flesh, not nature’s pharmacy.

The new therapeutic paradigm evolved from an earlier promise of hormone drugs and the flowering of intelligent drug design. It grew from an understanding of receptors and from breakthroughs in genomics and genetic engineering. It found favor in the new power of combinatorial chemistry and the modeling and data management abilities of the fastest computers. Hope was renewed for a next generation of magic bullets born of human genes.

As the 1990s ended, many genetically based drugs were in clinical trials, and a wealth of genome sequences, their promise as yet unknown, led scientists to reason that the 21st century would be the biotech century.
Computers and combinatorial technology
Combinatorial chemists produce libraries of chemicals based on the controlled and sequential modification of generally immobilized or otherwise tagged chemical starting blocks. These original moieties are chosen, when knowledge allows, for their predicted or possible behavior in a functional drug, protein, polymer, or pesticide. Developing the knowledge base for starting materials proved to be one of the greatest benefits of computer modeling of receptors and the development of computational libraries (multiple structures derived from computer algorithms that analyze and predict potentially useful sequences from databases of gene or protein sequences, or structural information from previous drugs).

Here the search for natural products remained critical—for the discovery of new starting places. Finding new drug starting places, as much as finding drugs that were useful “as is,” became important in the 1990s as companies tried to tap into the knowledge base of traditional medical practitioners in the developing world through collaboration rather than appropriation. In this manner, several companies promoted the analysis of the biodiversity available in tropical rainforests and oceans. This “added value” of biodiversity became both a rallying cry for environmentalists and a point of solidarity for political muscle-building in the developing world. As the industrialized world’s demand for new drugs (and associated profits) increased, developing nations sought to prevent the perceived exploitation of their heritage.

And just as previous sources of drugs from the natural world were not wholly ignored (even as the human genome became the “grail” of modern medical hopes), combinatorial approaches created a new demand for the services of traditional organic and analytical chemistry. Although computer models were beneficial, new wet chemistry techniques still had to be defined on the basis of the new discoveries in genomics and proteomics. They had to be modified for microsystems and mass production—for the triumph of the microtiter plate over the flask, the test tube, and the beaker. Drugs were still chemical products, after all.
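The combinatorial arithmetic behind such libraries is easy to make concrete. Below is a minimal Python sketch, not drawn from the article, that enumerates a virtual library from interchangeable building blocks on a fixed scaffold; the group names and the three-position scaffold are invented for illustration, and the library size is simply the product of the block counts.

```python
# A minimal sketch of enumerating a computational combinatorial library:
# every combination of interchangeable building blocks at three variable
# positions on a fixed scaffold. All block names here are hypothetical.
from itertools import product

r1_groups = ["methyl", "ethyl", "phenyl"]   # hypothetical R1 substituents
r2_groups = ["amino", "hydroxyl"]           # hypothetical R2 substituents
r3_groups = ["chloro", "fluoro", "bromo"]   # hypothetical R3 substituents

def enumerate_library(r1, r2, r3):
    """Yield every scaffold-R1-R2-R3 combination in the virtual library."""
    for a, b, c in product(r1, r2, r3):
        yield {"scaffold": "core", "R1": a, "R2": b, "R3": c}

library = list(enumerate_library(r1_groups, r2_groups, r3_groups))
print(f"{len(library)} candidate structures")  # 3 * 2 * 3 = 18
```

Even this toy example shows why automation mattered: three modest block sets already yield 18 candidates, and real libraries multiplied dozens of blocks across many positions.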
High-throughput screening
Researchers continued to move away from radioactivity in bioassays, automated sequencing, and synthesis by using various new tagging molecules to directly monitor cells, substrates, and reaction products by fluorescence or phosphorescence. Fluorescence made it possible to examine, for the first time, the behavior of single molecules or molecular species in in vivo systems. Roger Tsien and colleagues, for example, constructed variants of the standard green fluorescent protein for use in a calmodulin-based chimera to create a genetically based fluorescent indicator of Ca2+. They called this marker “yellow cameleon” and used it in transgenic Caenorhabditis elegans to follow calcium metabolism during muscle contraction in living organisms.

Of particular promise was the use of such “light” technologies with DNA microarrays, which allowed quantitative analysis and comparison of gene expression by multicolor spectral imaging. Many genes differ in their levels of expression, especially in cancerous versus normal cells, and microarray techniques showed promise in the discovery of those differentially expressed genes. Microarrays thus became a basic research tool and a highly promising candidate for high-throughput screening (HTS) in drug development.
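To make the microarray comparison concrete: in a two-color experiment, each gene’s expression difference is typically summarized as a log2 ratio of the two channel intensities. The Python sketch below assumes invented, background-corrected intensities and an arbitrary twofold cutoff; the gene names are illustrative only.

```python
# A minimal sketch of two-channel microarray analysis: flag genes whose
# tumor/normal log2 intensity ratio exceeds a twofold-change threshold.
# Intensities and the threshold are invented for illustration.
import math

intensities = {
    # gene: (tumor_channel, normal_channel), background-corrected
    "MYC":   (5200.0, 610.0),
    "TP53":  (480.0, 950.0),
    "GAPDH": (3100.0, 2980.0),
}

THRESHOLD = 1.0  # |log2 ratio| > 1 means at least a twofold change

for gene, (tumor, normal) in intensities.items():
    ratio = math.log2(tumor / normal)
    if abs(ratio) > THRESHOLD:
        direction = "up" if ratio > 0 else "down"
        print(f"{gene}: log2 ratio {ratio:+.2f} ({direction}-regulated in tumor)")
```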
Genomics meets proteomics
Given the decade’s new obsession with genomics, the 1990s may ultimately be best known for the production of the first complete genetic maps. The first full microorganism genome was sequenced in 1995 (Haemophilus influenzae, by Craig Venter and colleagues at The Institute for Genomic Research). This achievement was followed rapidly by the genome sequencing of Saccharomyces cerevisiae (baker’s yeast) in 1996; Escherichia coli, Borrelia burgdorferi, and Helicobacter pylori in 1997; the nematode C. elegans in 1998; and the first sequenced human chromosome (22) in 1999. The entrance of industry into the race to sequence the human genome at the tail end of the decade sped up the worldwide effort, albeit amid controversy.
Hot on the heels of the genomics “blastoff” was the development of proteomics—the science of analyzing, predicting, and using the proteins produced from the genes and from the cellular processing performed on these macromolecules before they achieve full functionality in cells. Both proteomics and genomics rely on bioinformatics to be useful. Bioinformatics is essentially the computerized storage and analysis of biological data, from standard gene sequence databases (such as the online repository GenBank maintained by the NIH) to complex gene-finding systems such as GRAIL (developed in 1991 by Edward Uberbacher of Oak Ridge National Laboratory). GRAIL and more than a dozen other programs were used to find prospective genes in genomic databases such as GenBank by employing various pattern recognition techniques. Pattern recognition techniques were also at the heart of the new DNA microarrays discussed above, and they were increasingly used to detect comparative patterns of gene transcription in cells under various conditions and states (including diseased vs. healthy).
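GRAIL itself relied on neural-network pattern recognition trained on known coding regions, which is well beyond a short example. As a simple stand-in for the same task, the Python sketch below scans a DNA string for open reading frames (an ATG start codon followed, in frame, by a stop codon); the sequence and minimum length are invented.

```python
# A toy gene-finding heuristic (NOT GRAIL's method): locate open reading
# frames on the forward strand of a DNA string. Sequence is invented.
def find_orfs(dna, min_codons=5):
    """Return (start, end) index pairs of ORFs at least min_codons long."""
    stops = {"TAA", "TAG", "TGA"}
    orfs = []
    for frame in range(3):           # check all three reading frames
        start = None
        for i in range(frame, len(dna) - 2, 3):
            codon = dna[i:i + 3]
            if codon == "ATG" and start is None:
                start = i            # open a candidate ORF at the start codon
            elif codon in stops and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append((start, i + 3))
                start = None         # close the candidate at the stop codon
    return orfs

sequence = "CCATGGCTGCTAAAGCTGCTGCTTAACCATGAAATAG"
print(find_orfs(sequence, min_codons=3))  # -> [(2, 26)]
```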
Human biotechnology
Attempts at gene therapy remained more promising than successful. Clinical trials on humans were disappointing compared with the phenomenal successes in mice, although limited tumor suppression did occur in some cancers, and there were promising reports on the treatment of hemophilia. Jesse Gelsinger, a teenager who suffered from the life-threatening liver disorder ornithine transcarbamylase deficiency, volunteered for adenovirus-delivered gene therapy in a University of Pennsylvania clinical trial in 1999. His subsequent death sent a shock wave through the entire research community, exposed apparent flaws in regulatory protocols and compliance, and increased public distrust of one more aspect of harnessing genes.

Using gene products as drugs, however, was a different story. From recombinant human insulin sold in the 1980s to the humanized antibodies of the 1990s, the successes of harnessing the human genome—whether “sensibly” (in the case of the gene product) or by using antisense techniques as inhibitors of human genes (in 1998, fomivirsen, used to treat cytomegalovirus retinitis, became the first approved antisense therapeutic)—proved tempting to the research laboratories of most major pharmaceutical companies. Many biotechnology medicines—from erythropoietin, tumor necrosis factor, dismutases, growth hormones, and interferons to interleukins and humanized monoclonal antibodies—entered clinical trials throughout the decade.

Beginning in the 1990s, stem cell therapy held great promise. This treatment uses human cells to repair and ameliorate inborn or acquired medical conditions, from Parkinson’s disease and diabetes to traumatic spinal paralysis. By 1998, embryonic stem cells could be grown in vitro, which promised a wealth of new opportunities for this precious (and controversial) resource.

Promising too were the new forms of tissue engineering for therapeutic purposes. Great strides were made in tissue, organ, and bone replacements. The demand for transplants, growing at 15% per year by the end of the decade, led to the search for appropriate artificial or animal substitutes. Cartilage repair systems, such as Carticel by Genzyme Tissue Repair, became commonplace. Under the Carticel process, patients’ cells were shipped to the company, cultured, and subsequently reimplanted. Second-generation products permitted autologous cells to be cultured on membranes, allowing tissue formation in vitro. Several companies focused on developing orthobiologics—proteins, such as growth factors, that stimulate the patient’s ability to regenerate tissues.
Epicel, a graft made from autologous cells, was also developed by Genzyme Tissue Repair to replace the skin of burn victims with greater than 50% skin damage. In 1998, Organogenesis introduced Apligraf, the first FDA-approved, ready-to-order human skin replacement, which was made of living human epidermal keratinocytes and dermal fibroblasts. Also undergoing research in 1999 was Vitrix soft tissue replacement, which was made of fibroblasts and collagen. By the end of the decade, artificial liver systems (which work outside the body) were developed as temporary blood cleansers providing detoxification and various digestion-related processes. In many cases, such treatments allowed the patient’s own liver to regenerate during the metabolic “rest.”

Such uses of cells and tissues raised numerous ethical questions, which crystallized in the media with the 1996 arrival of the clonal sheep Dolly. Reaction against the power of the new biotechnology was not restricted to fears of dehumanizing humanity through “xeroxing.” The possibility of routine xenotransplantation (using animal organs as replacements in humans) came to the fore with advances in immunology and genetic engineering that promised the ability to humanize animal tissues (specifically, those of pigs) in ways similar to the development of humanized antibodies in mice. The issue of xenotransplantation not only raised fears of new diseases creeping into the human population from animal donors, but was seen by some as a further degradation of human dignity, either intrinsically or through the misuse of animals. The animal rights lobby throughout the decade argued passionately against the use of animals for human health purposes.
The Red Queen’s race
As pathogens evolved resistance faster than new therapies could be fielded, pharmaceutical, software, and instrument companies alike turned to combinatorial chemistry and HTS technologies in an effort to regain the racing edge. AIDS remained a profoundly disturbing example of the failure of technology to master a disease, despite the incredible advances in understanding its biology that took place in the 1990s. Vaccine efforts occupied much of the popular press, and genetically engineered viral fragments seemed to be the best hope. But the proliferation of viral strains erased hope for a single, easy form of vaccination. Resistance to AZT therapy increased, and even the new protease inhibitors and so-called drug cocktails developed in the 1990s proved to be only stopgap measures as viral strains appeared that were resistant to everything thrown at them. Individual lives were prolonged, and the death rate in Western countries, where the expensive new drugs were available, dropped precipitously. But the ultimate solution to AIDS had not been found, nor even a countermeasure to its spread in the developing world and among poor populations of industrialized nations.

The middle of the decade saw a resurgence of the “old” plague (bubonic) in India, and even polio remained a problem in the developing world. In Africa, Ebola resurfaced in scattered outbreaks—although it was nothing compared with the continental devastation caused by AIDS. In the rest of the world, fears of biological warfare raised by the Gulf War continued.

Vaccination efforts were stepped up for many diseases. The World Health Assembly set a global goal in 1990 of a 95% reduction in measles deaths by 1995 compared with pre-immunization levels. By the deadline, estimated global coverage for measles vaccine had reached 78%, even as the industrialized world experienced a backlash against vaccination because of concerns about adverse side effects. Vaccine technology continued to improve with the development of recombinant vaccines for several diseases, new efforts to produce vaccines for old scourges such as malaria, and new nasal delivery systems that stimulated the mucosal-associated antibody system. DNA vaccines—the injection of engineered plasmids into human cells to stimulate antigen production and immunization—were first described in a 1989 patent and published in 1990 by Wolff, Malone, Felgner, and colleagues. They entered clinical trials in 1995. Although one editor of Bio/Technology called this the Third Vaccine Revolution, by the end of the decade the reality of this expansive claim remained in doubt, especially because of the continuing debate over genetic engineering. Efforts to develop food-based vaccines through the production of transgenic plants engineered with the appropriate antigens continued. This research stimulated studies on mucosal immunity and efforts to enable complex proteins to cross the gut–blood barrier.

Even with these technological developments, the decade ended with the negatives of ever-expanding disease problems, exacerbated by predictions that global warming would lead to new epidemics of insect-borne and tropical diseases. However, a note of optimism remained that rational design and automated production technologies would ultimately be able to defeat these diseases.
High tech and new mech
In the 1990s, the use of mass spectrometry (MS) for bioanalysis underwent a renaissance, with improvements such as ultrahigh-performance MS using Fourier-transform ion cyclotron resonance (FT-ICR MS) and tandem-in-time (multidimensional) MS for biological macromolecules. Improved techniques such as peak parking (reducing the column flow rate into the mass spectrometer the instant a peak is detected, to allow samples to be split and analyzed by multiple MS instruments nearly simultaneously) added several dimensions that were previously impossible. These changes improved the ability to analyze the complex mixtures required in studies of cellular metabolism and gene regulation. Results from multidimensional runs were analyzed by increasingly sophisticated bioinformatics programs and used to improve those programs’ knowledge bases. In combination with HPLC and various capillary electrophoretic systems, MS became part of a paradigm for pharmaceutical R&D as a valuable new approach for identifying drug targets and protein function.
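The peak-parking idea reduces to a simple control loop: watch the detector signal and cut the LC flow rate while a peak elutes, stretching the peak out so the mass spectrometer gains time for additional experiments. The Python sketch below is a schematic of that logic only; the flow rates, threshold, and detector trace are invented, and a real instrument would implement this in firmware.

```python
# A minimal sketch of peak-parking control logic: run at normal flow,
# and "park" (slow) the flow whenever the detector signal indicates a
# peak. All numeric values here are invented for illustration.
NORMAL_FLOW_UL_MIN = 200.0   # hypothetical normal column flow (uL/min)
PARKED_FLOW_UL_MIN = 10.0    # hypothetical reduced "parked" flow
PEAK_THRESHOLD = 1000.0      # detector counts taken to signal a peak

def flow_schedule(signal_trace):
    """Return the flow rate chosen at each time point of the trace."""
    rates = []
    for intensity in signal_trace:
        if intensity >= PEAK_THRESHOLD:
            rates.append(PARKED_FLOW_UL_MIN)   # peak eluting: slow it down
        else:
            rates.append(NORMAL_FLOW_UL_MIN)   # baseline: full flow
    return rates

trace = [50, 80, 1500, 4200, 3800, 900, 60]    # invented detector readings
print(flow_schedule(trace))
```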
Similarly, the development of multidimensional NMR techniques, especially those using more powerful instruments (e.g., 500-MHz NMR), opened the door to solving the structure of proteins and peptides in aqueous environments, as they exist in biological systems. The new NMR techniques allowed observations of the physical flexibility of proteins and the dynamics of their interactions with other molecules—a huge advantage in studies of a protein’s biochemical function, especially in receptors and their target molecules (including potential drugs). By viewing the computer-generated three-dimensional structure of a protein, made possible by the data gathered from these instruments, researchers could directly observe and study for the first time the way a ligand fits into the protein’s active site. The three-dimensional structure provided information about biological function, including the catalysis of reactions and the binding of molecules such as DNA, RNA, and other proteins.

In drug design, ligand binding by a target protein was used to induce the ultimate effects of interest, such as cell growth or cell death. By using the new technologies to study the structure of a disease’s target protein and learn about its active or ligand-binding sites, rational drug design sought to create inhibitors or activators that elicited a response. This correlation between structure and biological function (known as the structure–activity relationship, or SAR) became a fundamental underpinning of the revolution in bioinformatics. In the 1990s, the SAR was the basis by which genomics and proteomics were translated into pharmaceutical products.
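A minimal way to see how an SAR becomes quantitative is to fit measured activity against calculated molecular descriptors. The Python sketch below performs an ordinary least-squares fit of invented activities against two invented descriptors (logP and polar surface area); it illustrates the general form of a simple QSAR model, not any particular published one.

```python
# A minimal QSAR sketch: fit activity as a linear function of two
# molecular descriptors by least squares. All numbers are invented.
import numpy as np

# Hypothetical descriptors per compound: [logP, polar surface area]
X = np.array([
    [1.2, 40.0],
    [2.5, 35.0],
    [3.1, 20.0],
    [0.8, 60.0],
])
activity = np.array([5.1, 6.0, 6.8, 4.3])  # e.g., invented pIC50 values

# Add an intercept column and solve min ||A w - activity||.
A = np.hstack([X, np.ones((X.shape[0], 1))])
w, *_ = np.linalg.lstsq(A, activity, rcond=None)
print("weights (logP, PSA, intercept):", w)

candidate = np.array([2.0, 30.0, 1.0])  # new compound + intercept term
print("predicted activity:", candidate @ w)
```

The fitted weights indicate how each descriptor pushes activity up or down; in the 1990s the same idea, scaled to thousands of compounds and descriptors, let screening data feed back into library design.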
The “new” Big Pharma
A new business atmosphere, first seen in the 1980s and institutionalized in the 1990s, revealed itself. It was characterized by mergers and takeovers, and by a dramatic increase in the use of contract research organizations—not only for clinical development, but even for basic R&D. Big Pharma confronted a new business climate and new regulations, born in part from dealing with world market forces and protests by activists in developing countries.

Marketing changed dramatically in the 1990s, partly because of a new consumerism. The Internet made possible the direct purchase of medicines by drug consumers and of raw materials by drug producers, transforming the nature of business. Direct-to-consumer advertising proliferated on radio and TV because of new FDA regulations in 1997 that liberalized the requirements for presenting medication risks on electronic media compared with print.

The phenomenal demand for nutritional supplements and so-called alternative medicines created both new opportunities and increased competition in the industry—which led to major scandals in vitamin price-fixing among some of the most respected, or at least some of the biggest, drug corporations. So powerful was the new consumer demand that it represented one of the few times in recent history that the burgeoning power of the FDA was thwarted, when the agency attempted to control nutritional supplements as drugs. (The FDA retained the right to regulate them as foods.)
Promise and peril
In the 19th century, the natural world was the source of most medicines. With the dawn of magic bullets in the 20th century, complex organic chemistry opened up a world of drugs created in the laboratory, either modified from nature or synthesized de novo. As the Pharmaceutical Century progressed, from the first knowledge of human hormones to the discovery of the nature of genes and the tools for genetic engineering, the modern paradigm saw a recasting of what human flesh was for—suddenly it was a source of medicines, tissues, and patentable knowledge. Humankind, not the natural world, became the hoped-for premier source of drug discovery.
With genes and chemicals suddenly capable of manipulating the warp and woof of the human loom—both mind and body alike—the human pattern itself seemed open to design. According to pessimists, even if biotechnology were not abused to create “superhumans,” pharmaceuticals and health care could become the greatest differentiators of human groups in history—divided not by genetic race, but by economic factors. The new knowledge was found in the highest and most expensive technologies, and often in the hands of those more interested in patents than panaceas.

Yet with this peril of inequality comes the promise of human transformation for good. According to optimists, biotechnology—especially the use of transgenic plants and animals for the production of new drugs and vaccines, xenotransplantation, and the like—promises cheaper, more universal health care. Meanwhile, lifestyle drugs—pharmaceuticals for nonacute conditions such as sterility, impotence, and baldness—have also emerged as a fast-growing category.

From aspirin to Herceptin—a monoclonal antibody that blocks the overexpressed HER2 receptor in breast cancer patients—from herbal medicines to transgenic plants, from horse serum to xenotransplantation, from animal insulin to recombinant human growth hormone, the Pharmaceutical Century was one of transformation. It is too soon to predict what pattern the process will weave, but the human loom has gone high tech. The 21st century will be a brave new tapestry.
© 2000 American Chemical Society