
Industry Trends

Drug Discovery at the Dawn of the New Millennium: Technology Trends Past, Present, and Future

  • Date registered: 2000-12-20
  • Category: Industry Trends > Products > Biopharmaceuticals
  • Publication date: 2000-12-20
  • Source: BIO
  • Keywords: #Millennium-Technology

The past several years have witnessed a paradigm shift in the drug discovery process. The
traditional process of discovery, involving a few targets and some number of chemical structures
against which the targets were screened, has proved inadequate in terms of generating the quality
and quantity of lead compounds that have the potential of becoming blockbuster drugs.


The pharmaceutical industry has begun to adopt a smorgasbord of technologies to achieve better,
faster, and cheaper routes of drug discovery and development. The result of this shift in the
approach companies are taking has been the expansion of the biotechnology industry. This industry
has molded itself to serve as a spigot of technologies for the pharmaceutical community, providing
novel technologies that can be implemented by bigger drug companies. In this manner, the two
industries have a symbiotic existence.


This article, which was written by industry analyst Enal S. Razvi, Ph.D. (erazvi@ljlbio.com),
explores some key elements of the drug discovery process and the technologies that are being
implemented to address the bottlenecks (both throughput and monetary) in these areas. We believe
that the action in terms of technologies for drug discovery in the new millennium will center upon
three areas: (1) screening technologies, (2) genotyping technologies/gene expression technologies,
and (3) protein characterization/expression technologies.



★ Introduction


The two paradigms in drug discovery are best represented in Figures 3 and 4. Figure 3 presents the
old paradigm of discovery, and the new paradigm, driven by the discovery of new targets via
genomics and new compounds (preludes to drugs) via combinatorial chemistry, is presented in Figure
4. The shift that has taken place in terms of how companies are approaching drug discovery and
development was initiated by one serious bottleneck: the discovery and characterization of novel
genetic targets [i.e., molecules (mostly proteins) that are associated with disease
processes]. Much of this shift in the industry’s approach can be attributed to the realization
that genes play an important role in the precipitation of disease. As such, there has been a flurry
of activity in the field of genomics (analyzing genes en masse), a topic addressed in more depth in
the companion article on page 44 titled Genomics: A Decade of Discovery and a New Perspective for
Therapeutic Development. Genomics has the power to generate targets in significantly greater
numbers than has been possible via traditional approaches. These targets are associated with a
specific disease or group of diseases in question; the challenge thus becomes identifying small
molecules that are capable of pharmacologically modulating the target.


Combinatorial chemistry has been the discipline that seeks to provide compounds diverse enough to
cover the entire chemical space. The expansion of combinatorial chemistry as a way to feed the
pipeline created by genomics is another element that has induced the shift in the paradigm of drug
discovery. There are a number of features associated with the new paradigm. The genomics ensemble
and novel chemistries are driving the need for high-throughput screening (HTS) and ultra-high-
throughput screening (UHTS). Single nucleotide polymorphisms (SNPs) are contributing to the
discovery of new genomic targets and are also involved later in the process, at the lead compound
evaluation and marketing stage. The new concept of pharmacogenomics involves the stratification
of patients according to the lesion(s) that precipitate disease [as opposed to the traditional
approach of treating whole patient populations as homogeneous], and SNPs as a molecular signature
of the genome are gaining wide popularity. High-throughput technologies are being developed in
order to cost-effectively genotype large numbers of individuals across their various SNPs, in an
attempt to define associations of phenotype (disease) with molecular signatures in the genome.
Given this backdrop (the traditional mode of drug discovery then, and the new paradigm for drug
discovery now), we now examine some of the key technologies in drug discovery that are driving
this paradigm forward.



★ Screening/Miniaturization Technologies Ensemble


HTS and UHTS in drug discovery. In order to perform effective HTS, yielding information that can be
used to identify leads, the need exists for robust assays that can translate biological reactions
into a suitable readout. For this reason, a large number of assay readouts are employed, exploiting
different physicochemical properties. The first level of segmentation occurs when radioactive
vs. nonradioactive assays are chosen. Radioactive assays have been used for a number of years, but
they pose various issues, such as waste disposal and exposure to radioactivity. In recent years, the
use of fluorescence-based readouts has gained wide acceptance. In fact, a large number of assays
currently practiced in both low-throughput and high-throughput formats (for drug discovery
purposes) are based upon simple fluorescence measurements or variants thereof. These variants may
be fluorescence resonance energy transfer (FRET), fluorescence polarization (FP), time-resolved
fluorescence (TRF), or fluorescence lifetime measurements.


As the number of samples screened in the pharmaceutical community grows as a result of the
increasing numbers of genomics-based targets and combinatorial libraries, so does the drive to move
toward faster assay formats. For this reason, the pharmaceutical community is moving toward the use
of homogeneous, or mix and measure, assay formats, as opposed to heterogeneous assays (used in the
past), which involve multiple washing/mixing steps. Table 14 presents the issues involved in
homogeneous versus heterogeneous assay formats and demonstrates the benefits of the homogeneous
format.


HTS technologies. An overview of TRF and FP is presented here, since these two screening
technologies are homogeneous and their acceptance is predicted to expand in the screening space
going forward.


SPA™ and HTRF/LANCE both rely on the biological interaction between a pair of molecules, for
example, a receptor-ligand pair. Upon this interaction, they are brought into close proximity on a
molecular scale. In SPA, which is a radioactivity-based assay marketed by Amersham Pharmacia
Biotech (Little Chalfont, Buckinghamshire, UK), the SPA bead contains a fluor that emits a photon
of light upon excitation by a beta particle. In a typical assay format, the SPA bead is coated with
a receptor, for instance, and the ligand(s) to be tested are radioactively labeled. Upon a
biologically relevant interaction, beta particles emitted by the radiolabeled ligand excite the
fluor contained within the bead. The excited fluor then emits a photon (light) of a given
wavelength that can be captured by a detector and measured. This is the principle of the SPA assay:
a homogeneous, radioactive assay for HTS.
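
As a concrete illustration, the following Python sketch models the SPA readout described above
under simplifying assumptions (single-site binding, receptor-coated beads with receptor in excess);
the concentrations, counting efficiency, and background level are hypothetical, not figures from
the article.

```python
# Minimal sketch of the SPA principle: only radiolabeled ligand bound at
# the bead surface is close enough for its beta particles to excite the
# bead's fluor, so unbound label contributes little and no wash is needed.

def fraction_ligand_bound(receptor_nM: float, kd_nM: float) -> float:
    """Fraction of labeled ligand bound when receptor is in excess:
    bound/total = [R] / ([R] + Kd)."""
    return receptor_nM / (receptor_nM + kd_nM)

def spa_signal_cpm(label_cpm: float, receptor_nM: float, kd_nM: float,
                   efficiency: float = 0.4, background_cpm: float = 50.0) -> float:
    """Counts: bound label times scintillation efficiency, plus a small
    background from unbound label decaying near a bead by chance."""
    return label_cpm * fraction_ligand_bound(receptor_nM, kd_nM) * efficiency + background_cpm

for receptor in (1.0, 10.0, 100.0):   # nM of bead-coupled receptor (hypothetical)
    print(f"{receptor:>6.1f} nM receptor -> {spa_signal_cpm(10_000, receptor, kd_nM=10.0):.0f} cpm")
```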



TRF. The theory governing the HTRF/LANCE assay format is based upon energy transfer that occurs
when the molecules in a donor-acceptor pair are brought close to each other. In this manner, a
receptor-ligand interaction may be assayed by labeling the receptor with a donor molecule; chelated
europium is utilized for this purpose. The chelation prevents the europium from being quenched via
the environment, and the chelating molecule (the cage) around the europium serves as an antenna
collecting and concentrating the energy onto the europium. The acceptor moiety in this format is an
allophycocyanin (called XL-665) that accepts the energy from the europium and emits it at a
different wavelength. The TRF assay works because the europium has a long fluorescent lifetime [on
the order of microseconds]. Upon excitation, the europium holds the energy for the length of its
fluorescent lifetime, and subsequently releases it via energy transfer to the acceptor molecule. This
time lag is utilized in the assay to screen out contaminating signal (i.e., from background that
has a low fluorescent lifetime). Once again, HTRF/LANCE is a homogeneous assay and does not involve
wash steps, allowing its implementation into the HTS realm.
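
The time-gating idea can be made concrete with a small calculation. The sketch below, with
illustrative (assumed) lifetimes and amplitudes, integrates two exponential decays after a counting
delay: nanosecond-lifetime background is rejected almost completely, while the microsecond-lifetime
europium signal survives.

```python
import math

def integrated_signal(amplitude: float, lifetime_us: float,
                      delay_us: float, window_us: float) -> float:
    """Integral of A * exp(-t / tau) from delay to delay + window."""
    tau = lifetime_us
    return amplitude * tau * (math.exp(-delay_us / tau)
                              - math.exp(-(delay_us + window_us) / tau))

# Prompt background: ~5 ns lifetime, large amplitude (hypothetical).
background = integrated_signal(1e6, 0.005, delay_us=50, window_us=400)
# Chelated europium: ~700 us lifetime, far smaller amplitude (hypothetical).
europium = integrated_signal(1e3, 700.0, delay_us=50, window_us=400)
print(f"gated background: {background:.3g}, gated europium: {europium:.3g}")
# Despite a 1,000-fold smaller amplitude, the gated europium signal
# dominates, because the short-lived background decays before counting starts.
```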








FP. In addition to SPA and HTRF/LANCE, FP provides a powerful method for homogeneous [totally
solution-based] nonradioactive assays for HTS. FP is based on the principle that a small molecule
(the fluorescent label) tumbles rapidly in solution; when such molecules are excited with
plane-polarized light, they tumble during the lifetime of the emission, and this tumbling causes
the emitted light to be depolarized. When the label is bound to a much larger molecule (a receptor,
for example), its tumbling slows, the emission remains largely polarized, and the degree of
polarization thereby serves as a readout of binding.
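
A minimal sketch of the FP readout follows, using the standard polarization definition
P = (I_parallel - I_perpendicular) / (I_parallel + I_perpendicular), reported in millipolarization
(mP) units; the intensity values are hypothetical.

```python
def polarization_mP(i_parallel: float, i_perpendicular: float) -> float:
    """Polarization in mP from intensities measured parallel and
    perpendicular to the excitation plane."""
    return 1000.0 * (i_parallel - i_perpendicular) / (i_parallel + i_perpendicular)

# Free tracer: tumbles fast, emission largely depolarized -> low mP.
print(polarization_mP(1050, 950))   # ~50 mP
# Tracer bound to a large receptor: tumbling slows, polarization retained.
print(polarization_mP(1300, 700))   # ~300 mP
```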


The need for HTS and UHTS in drug discovery is predicted to expand as we enter the new millennium.
Going forward, this space is expected to yield significant opportunities, including the following:


HTS assays that work across a wide variety of targets

High-information-content (multiplexed) assays

Cell-based assays for biologically true readouts

Integration of toxicological data into the up-front screening efforts


Miniaturization technologies. The three key drivers in the adoption (and implementation) of novel
technologies in the drug discovery paradigm include the need for: (1) robustness, (2) speed, and
(3) lower cost per data point. These factors are fueling the drive to miniaturize. Probably the
most dramatic impact of miniaturization technologies has been on the screening ensemble (presented
in the previous section). The traditional 96-well format for screening (employed enterprise-wide)
has been revised to higher-density formats, and 384-well and 1,536-well formats are now being
implemented across the industry. There are several objectives for miniaturization in HTS, including:


Identification of better lead compounds

Optimization of these lead compounds

Cost savings (per sample) as throughput increases


One of the drivers in HTS today is the need to identify better lead compounds; these are
compounds that have a better chance to get through the rigorous drug discovery and development
process. Miniaturization is designed to accomplish this because it offers higher information
density per unit volume of sample that is screened. In other words, screening through lots of
compounds faster (and using less sample per screen) allows a greater amount of information to be
gathered in a given unit of time. In this manner, compounds with no downstream value can be triaged
earlier in the process. Lead optimization, the process of obtaining structure-activity
relationships (SAR) and using medicinal chemistry to find tighter leads that have a better chance
of making it through the process of discovery, also benefits from miniaturization. Higher-density
microplates (e.g., 1,536 wells) allow for more iterations of the same compound backbone (with
different substituents attached) to be examined in toto, thus offering better chances of finding
leads that have high efficacy and low toxicity in the next stages of discovery (i.e., preclinical
testing). As throughput increases, cost savings (per sample screened) can be realized quite
effectively. Figure 5 presents the history and future of miniaturization in screening.
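
The cost argument can be put in back-of-the-envelope terms. In the sketch below, the well volumes
and the per-liter reagent cost are illustrative assumptions, not figures from the article; the
point is simply that cost per data point falls roughly in proportion to assay volume as well
density rises.

```python
# Hypothetical assay volumes per well for the three plate formats (liters).
plate_formats = {96: 100e-6, 384: 25e-6, 1536: 2.5e-6}
reagent_cost_per_liter = 50_000.0   # hypothetical ($/L of assay reagent)

for wells, volume in plate_formats.items():
    cost_per_well = volume * reagent_cost_per_liter
    print(f"{wells:>5}-well plate: {volume * 1e6:>6.1f} uL/well, "
          f"${cost_per_well:.3f} per data point, "
          f"{wells} data points per plate footprint")
```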


Miniaturization technologies going forward. One of the interesting strategies that is emerging in
the miniaturization universe is the adoption of the so-called high-content screening, or HCS. In
HCS, a large number of data points are acquired per sample screened, thereby reducing the cost/data
point acquired. One of the key players in HCS is Cellomics, Inc. (Pittsburgh, PA). Cellomics’
platform seeks to perform automated, cell-based assays utilizing fluorescence as the readout.
Cellomics’ HCS platform is being designed to measure multiple pathways within the cell and
multiple entities lying in a single pathway, as well as to determine the differential responses of
cell sub-populations and morphological responses to stimuli.


In other words, Cellomics’ HCS is designed for the measurement of temporal and spatial events in
situ [in the cell’s microenvironment]. Cellomics seeks to achieve HCS by combining proprietary
instrumentation, screening assays, informatics backbones, and proprietary databases. Their
ArrayScan™ system is an HCS system based upon an automated fixed-endpoint screening system. This
contains an integrated robotic plate loader and a scanner capable of reading 96-, 384- and 1,536-
well plate formats. In addition, live cell kinetic assays can be performed using a live cell
chamber. All this is coupled with an optimized image acquisition and analysis system. Cellomics has
a collaboration with Zeiss (Jena, Germany) for detection instrumentation as well as with Molecular
Probes (Eugene, OR) for proprietary fluorescent dyes to be used as markers in the assays.


In addition to their ArrayScan system, Cellomics is building a CellChip™ system. This system is
composed of two-dimensional microarrays of cells coupled to microfluidic delivery of compounds, a
CellChip reader, cell-based biosensors (the assay), and a Cellomic™ database for information
mining. Cellomics’ strategy, therefore, is to achieve miniaturization using cellular readouts and
multiple data points acquired per sample. According to Cellomics, the fact that the data is
acquired out of live cells themselves makes it closer to the physiological situation than the data
acquired from in vitro binding assays.


Along with Cellomics, another company called SEQ Ltd. (Lawrenceville, NJ) is working on the
miniaturization problem in the drug discovery business. SEQ’s platform is a well-by-well
fluorescence imager for screening and profiling. This company is utilizing confocal optics in order
to minimize the interference from compounds, plates, and media fluorescence. Ultimately, the goal
in miniaturization is the utilization of microfluidics and microchannels in a chip-based format.
Both ACLARA Biosciences (Mountain View, CA) and Caliper Technologies (Mountain View, CA) are
aggressively pursuing this space. It is, however, difficult to imagine the impact that these
technologies, which are still very experimental, will make in drug discovery.


Where are we going? So, why miniaturize? Why the need for faster screening operations? Ten years or
so ago, pharmaceutical scientists worked on a given target for years. These scientists collected
compounds from soil samples from remote parts of the Earth and synthesized compounds at random in
the laboratory. And what has come of that: costly and slow drug discovery and development as well
as toxicities, side effects, and a number of ineffective drugs! The ability to implement HTS
operations under miniaturized conditions promises to generate large amounts of relevant data on
compound safety, toxicity, efficacy, and action under a variety of conditions, all at a low cost.
This, in turn, will drive effective (and cost-effective) drug discovery and development, which will
serve to keep the pharmaceutical community turning out more tailor-made medications with high
efficacy and low toxicity.



★ Genotyping/Gene Expression Analysis Ensemble


SNP genotyping. From a commercial standpoint, several reasons for the recent burgeoning of interest
in SNPs stand out. First, a great deal of research will be done in this area over the next several
years; if companies like Affymetrix (Santa Clara, CA), PE Biosystems (Foster City, CA) and others
can supply the consumables and instruments to support this endeavor, they will generate substantial
chunks of business. Second, although massively parallel gene expression studies have already
accelerated the identification of new molecular targets for drug discovery, fundamental gene-
disease associations have not been established for the most prevalent genetically complex diseases.
The SNP endeavor promises to generate not only more targets, but better targets, ones that
represent a much more in-depth understanding of the associated disease processes. Finally,
pharmacogenomics has become a subject of intense interest for large pharmaceutical companies and
the venture companies that hope to serve them. The promise of drugs, both new and old, that can be
delivered to genetically responsive individuals and withheld from genetically averse individuals
compels such interest. Pharmacogenetics, which is an established albeit low-key field, studies
genetic variation underlying differential response to drugs.


Pharmacogenomics is the application of modern genomic technologies to the rapid discovery of drug-
response genes. There are several new technologies available for SNP genotyping, as illustrated
below:


Allele-specific hybridization

DNA chip [microarray] with fluorescence detection

5’-nuclease cleavage with FRET detection

Molecular beacon with fluorescence detection

Allele-specific PCR with intercalating dye fluorescence detection

Allele-specific nucleotide incorporation

Minisequencing (single base extension) on microarrays with fluorescence detection

Minisequencing (single base extension) with FRET detection

Minisequencing (single base extension) with fluorescence polarization

Pyrosequencing with fluorescence detection

Allele-specific oligonucleotide ligation

Oligonucleotide ligation on microarrays with fluorescence detection

Dye-labeled oligonucleotide ligation with FRET detection
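
To make one of the listed formats concrete, here is a minimal sketch of genotype calling for
minisequencing (single base extension) with fluorescence polarization detection: incorporation of
an allele-specific dye-terminator slows the tracer's tumbling and raises the mP reading. The
threshold and sample readings are hypothetical.

```python
def call_genotype(mP_allele1: float, mP_allele2: float,
                  threshold_mP: float = 150.0) -> str:
    """Call a genotype from the mP readings of the two allele-specific
    dye-terminators; a high reading means that terminator was incorporated."""
    a1 = mP_allele1 > threshold_mP
    a2 = mP_allele2 > threshold_mP
    if a1 and a2:
        return "heterozygote"
    if a1:
        return "homozygote, allele 1"
    if a2:
        return "homozygote, allele 2"
    return "no call"

for sample in [(280, 40), (60, 260), (240, 230), (30, 25)]:  # hypothetical readings
    print(sample, "->", call_genotype(*sample))
```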

In essence, the need for SNP genotyping is going to be driven by the availability of genetic
targets that can be used to stratify populations [for a given phenotype, such as a disease].
Therefore, SNP genotyping technologies that will see expanded use in the new millennium will be
robust across SNP sequences, cost-effective and high-throughput. Figure 6 presents results of an
association study where a population was subdivided (stratified) using a particular SNP into the
two respective homozygote populations and a heterozygote population. Using such tools, researchers
will be able to correlate the genotype for some given marker in the genome with a given phenotype,
such as susceptibility to a particular disease.
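
A minimal sketch of the kind of case-control association test behind such a study is shown below;
the genotype counts are hypothetical, and the test (a chi-square test on a genotype-by-status
contingency table) is a standard choice rather than the specific method used in the figure.

```python
from scipy.stats import chi2_contingency

#              AA    Aa    aa   (two homozygote classes + heterozygotes)
cases    = [  60,  110,   30]   # hypothetical counts in the disease group
controls = [ 100,   85,   15]   # hypothetical counts in the control group

chi2, p_value, dof, expected = chi2_contingency([cases, controls])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests the genotype distribution differs between the
# two populations, i.e., an association between this SNP and the phenotype.
```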


Gene expression analysis. Genomic approaches, by definition, provide a wholesale way of
visualizing DNA. The cornerstone of genomics is parallel processing, in the extreme case examining
every gene at once. The parallel approach to examining genes is powerful since subtle patterns can
be revealed. In order to analyze the expression of genes in a parallel fashion, researchers have
attempted to tether massive numbers of genetic elements onto the same solid support at the same
time; this is the domain of DNA microchips. DNA microchips, in the simplest definition, are composed
of a solid support, such as a glass chip or a silicon wafer, to which pieces of DNA are adsorbed.


For analyzing the expression of a set of genes in vivo, mRNA from the cell is isolated and
purified. A complementary DNA (cDNA) of the mRNA is produced and hybridized to the chip.
Knowing the structure of the array, a positive signal (by way of fluorescence upon hybridization of
the added nucleic acid to the probe sequence on board the chip) allows the researcher to determine
which gene was expressed. Furthermore, the intensity of the spot [of fluorescence] on the chip
provides an indication of the extent of hybridization and thereby provides a quantitative index of
gene expression. As such, DNA arrays can be used for: (1) qualitatively assessing which gene(s) are
expressed and thereby deducing patterns of gene expression, and (2) quantitatively assessing the
expression of a gene(s) and thereby assessing the extent of gene expression (relative to some
control situation).
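
The two uses just described can be made concrete with a small sketch; the spot intensities and the
detection floor below are hypothetical.

```python
import numpy as np

genes   = ["geneA", "geneB", "geneC", "geneD"]
control = np.array([ 500.0, 1200.0,  80.0, 300.0])   # fluorescence units (hypothetical)
treated = np.array([2000.0, 1150.0,  75.0,  60.0])

# (1) Qualitative call: is the gene expressed above a detection floor?
expressed = treated > 100.0                           # hypothetical floor
# (2) Quantitative index: log2 ratio of treated to control intensity.
log2_ratio = np.log2(treated / control)

for gene, flag, ratio in zip(genes, expressed, log2_ratio):
    print(f"{gene}: expressed={bool(flag)}, log2(treated/control)={ratio:+.2f}")
```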


The most current approach to determining gene expression is to deploy DNA arrays loaded
with probes of interest to capture genetic elements of interest. Gene expression can be used
for gene discovery, classifying genes into categories (corresponding to different functional
elements in vivo), and ascribing biological function to genetic elements (functional genomics).


Where are we going? It is interesting to reflect on the changes that have taken place in genotyping
and gene expression analysis in the last ten years. About ten years ago, researchers studied genes
in isolation, and their expression was indexed against maybe a handful of other genes. Genotyping
was low-throughput, costly, and performed against a few genetic elements that were not totally
characterized. At the dawn of the new millennium, however, we possess highly dense DNA microchips
onto which can be loaded the entire human genome. We can now study the expression of a given gene
in relation to every other gene in the human genome! We have genotyping technologies that can
quickly sort through the entire source of genetic variety and therefore elucidate fine structure
relationships between traits and markers. This is the power of genomics technologies as we enter
2000.



★ Proteomics: Protein Characterization/Expression


Proteomics aims to bridge the gap between genomics and the protein profile of each cell or tissue
at a given time. Direct separation and identification of all the proteins yields the true protein
profile of that tissue. Differences in the protein profiles of healthy and diseased tissues
pinpoint those proteins that are important in a particular pathology. Although
proteomics and functional genomics use similar strategies of comparing RNA or protein profiles in
normal and abnormal tissues, proteomics can reveal changes that are undetectable by functional
genomics. Since synthesis and degradation of proteins can vary widely, in principle the message
and protein profiles in a particular cell can differ at the same time. Therefore, quantitative
differences in the message profile can be misleading. Furthermore, aberrant protein modifications
can cause disease in the absence of any changes in protein levels. Proteomics
can reveal abnormal localization of proteins both inside the cell and in interstitial cavities as
well. Proteomics aims to make possible the direct and simultaneous study of all or at least most of
the proteins in a particular cell and their interaction with one another.


In general, proteomics research can be divided into two categories. The first aims to profile a
limited number of well-characterized proteins in normal and diseased states; the second aims to
identify genome-wide alterations in the quantity and quality of unknown proteins. The former can
utilize more traditional protocols combined with recent developments in automation and chip
technology. The latter, however, requires more novel approaches since specific reagents for the
identification of these uncharacterized proteins do not yet exist.



★ Emerging Technologies in Proteomics: Overview and Analysis


The current state of the art involves the separation of proteins in complex mixtures via two-
dimensional (2-D) gel electrophoresis or chromatography combined with mass spectrometry. This is
currently tedious and the throughput is not comparable to DNA array technology for profiling gene
expression. Furthermore, certain proteins cannot be stained well in 2-D gels or the resolution and
detection levels from chromatography are not sensitive enough. These limitations are being
addressed in the form of protein arrays or chips. Certain protein chips produced by Ciphergen (Palo
Alto, CA) are already on the market. Figure 7 presents a schematic of the protein chip concept. The
idea is analogous to that of DNA microchips: in this instance, massively parallel analysis of
proteins. Ciphergen’s ProteinChip™ system can quickly and sensitively profile the spectrum of
proteins in very small samples. By combining on-chip affinity purification and secondary chemical
or enzymatic processing with mass spectrometry, this tool provides specificity, sensitivity, and
speed.


Ultimately, the protein chip is a tool for the differential analysis of protein expression, much in
the same manner as DNA chips analyze differential gene expression. For example, the protein chip is
capable of producing a profile of proteins, up- and down-regulated, between healthy individuals and
patients. In this manner, the proteinaceous lesions responsible for disease can be mapped and
correlated with, say, the genotype at particular SNPs or with particular genes that are expressed
in vivo.
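
As an illustration of such differential profiling, the sketch below matches peaks between two
hypothetical chip-derived mass spectra within a mass tolerance and flags up- and down-regulated
proteins; all masses, intensities, and thresholds are assumptions.

```python
TOLERANCE_DA = 1.0   # hypothetical peak-matching tolerance

healthy = {11730.0: 900.0, 13880.0: 400.0, 28050.0: 650.0}   # m/z -> intensity
patient = {11730.5: 250.0, 13879.8: 410.0, 28049.5: 2100.0}

for mass_h, int_h in healthy.items():
    for mass_p, int_p in patient.items():
        if abs(mass_h - mass_p) <= TOLERANCE_DA:
            fold = int_p / int_h
            call = "up" if fold > 2 else "down" if fold < 0.5 else "unchanged"
            print(f"~{mass_h:.0f} Da: fold = {fold:.2f} ({call} in patient)")
```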


Where are we going? In summary, proteomics provides a framework of technologies for the high-
throughput identification, characterization and understanding of proteins discovered via the
genomics ensemble. Ten years ago, there were few targets and most of the proteins available were
studied individually, with their functions studied in isolation. In the new paradigm of drug
discovery, thousands of genes identified (via genomics) will be analyzed and their products
characterized via proteomics. As we progress into the new millennium, we will find technologies for
proteomics that are highly multiplexed, cost-effective, and data-intensive. In other words,
genomics and proteomics will merge to seamlessly create a bridge from gene to function.



★ In Conclusion


Looking back over the past decade as we enter the new millennium, the technological approaches that
seem to have had the greatest influence on drug discovery and development have included new
technologies for pharmaceutical screening, genomics and protein structure/function understanding.
Each of the technologies chosen and presented in this article has made an impact on the industry
and is expected to influence the landscape in the coming years. The novel technologies presented
herein, however, are by no means the only ones available; we have chosen vignettes of technologies
that have distinguished themselves. Some things to expect in the future are as follows:


Screening technologies becoming more robust and higher in throughput will serve to help ease the
bottleneck in discovery as well as to restrain rising drug discovery/development costs

Genomics will become the key pivot around which discovery will revolve; better, faster, cheaper
ways of genotyping will mean tailor-made drugs and fewer side effects

Proteomics will mean greater understanding of biology as a whole, the result being more effective
drug discovery and development and better medicines


In the next ten years, the above-mentioned technologies will very likely become commonplace,
implemented by the pharmaceutical industry as well as by physicians. The coming years will see
tremendous leaps in the understanding of biology and disease; as such, it is difficult to predict
what we will be writing about in 2010 in terms of drug discovery and development.




(Source: http://www.bio.com)
