Toxicogenomics


Toxicogenomics is a subdiscipline of pharmacology that deals with the collection, interpretation, and storage of information about gene and protein activity within a particular cell or tissue of an organism in response to exposure to toxic substances. Toxicogenomics combines toxicology with genomics or other high-throughput molecular profiling technologies such as transcriptomics, proteomics, and metabolomics. Toxicogenomics endeavors to elucidate the molecular mechanisms involved in the expression of toxicity, and to derive molecular expression patterns that predict toxicity or the genetic susceptibility to it.

Pharmaceutical research

In pharmaceutical research, toxicogenomics is defined as the study of the structure and function of the genome as it responds to adverse xenobiotic exposure. It is the toxicological subdiscipline of pharmacogenomics, which is broadly defined as the study of inter-individual variations in whole-genome or candidate gene single-nucleotide polymorphism maps, haplotype markers, and alterations in gene expression that might correlate with drug responses. Although the term toxicogenomics first appeared in the literature in 1999, it was by that time already in common use within the pharmaceutical industry, where its adoption had been driven by the marketing strategies of vendor companies. The term is still not universally accepted, and others have offered alternative terms such as chemogenomics to describe essentially the same field of study.

Bioinformatics

The nature and complexity of the data demand highly developed processes of automated handling and storage. The analysis usually draws on a wide array of bioinformatics and statistical methods, often including statistical classification approaches, as in the sketch below.
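As a minimal illustration of such a classification approach, the following sketch trains a classifier to separate toxic from non-toxic compounds on the basis of gene expression profiles. The data are synthetic placeholders, and the choice of a random forest from scikit-learn, the study dimensions, and the cross-validation setup are illustrative assumptions rather than a standard toxicogenomics pipeline.

```python
# Minimal sketch of a statistical classification workflow of the kind used in
# toxicogenomics: predicting a toxic/non-toxic label for compounds from gene
# expression profiles. All data below are synthetic placeholders; real studies
# would use curated expression matrices from microarray or RNA-seq experiments.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_compounds, n_genes = 60, 500                 # hypothetical study size
X = rng.normal(size=(n_compounds, n_genes))    # one expression profile per compound
y = rng.integers(0, 2, size=n_compounds)       # 1 = toxic, 0 = non-toxic (placeholder labels)

# A random forest is one common choice among many possible classifiers.
clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Cross-validation estimates how well expression patterns predict toxicity.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With random labels the accuracy hovers around chance; with real expression data the same workflow would be used to assess whether a toxicity-predictive signature is present.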

Drug discovery

In pharmaceutical drug discovery and development, toxicogenomics is used to study possible adverse effects of pharmaceutical drugs in defined model systems in order to draw conclusions about the toxic risk to patients or the environment. Both the United States Environmental Protection Agency and the Food and Drug Administration currently preclude basing regulatory decision-making on genomics data alone, but they encourage the voluntary submission of well-documented, quality genomics data. Both agencies are considering the use of submitted data on a case-by-case basis for assessment purposes or for populating relevant comparative databases, and to that end they encourage the parallel submission of genomics data and traditional toxicological test results.

Public projects

Chemical Effects in Biological Systems is a project hosted by the National Institute of Environmental Health Sciences that builds a knowledge base of toxicology studies, including study design, clinical pathology, histopathology, and toxicogenomics data.
InnoMed PredTox assesses the value of combining results from various omics technologies with results from more conventional toxicology methods to support more informed decision-making in preclinical safety evaluation.
Open TG-GATEs is a Japanese public-private effort which has published gene expression and pathology information for more than 170 compounds.
The Predictive Safety Testing Consortium aims to identify and clinically qualify safety biomarkers for regulatory use as part of the FDA's "Critical Path Initiative".
ToxCast is a United States Environmental Protection Agency program for predicting hazard, characterizing toxicity pathways, and prioritizing the toxicity testing of environmental chemicals.
Tox21, a federal collaboration involving the National Institutes of Health, the Environmental Protection Agency, and the Food and Drug Administration, aims to develop better toxicity assessment methods. Within this project, the toxic effects of chemical compounds on cell lines derived from 1000 Genomes Project individuals were assessed, and associations with genetic markers were determined. Parts of these data were used in the NIEHS-NCATS-UNC DREAM Toxicogenetics Challenge to develop methods for predicting cytotoxicity in individuals; a simplified sketch of such an association analysis is given below.
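The association step mentioned for Tox21 can be illustrated as follows: each genetic marker is tested for correlation with an individual-level cytotoxicity readout, and a multiple-testing correction is applied. The simulated genotypes, the EC50-style readout, and the per-marker linear regression below are assumptions made for illustration and do not reproduce the actual challenge methodology.

```python
# Illustrative sketch of a genotype-cytotoxicity association scan. Individuals'
# cytotoxicity readouts (e.g. log EC50 values for one compound) are regressed on
# allele dosage (0/1/2) at each marker. Data are simulated stand-ins, not Tox21
# or 1000 Genomes data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n_individuals, n_snps = 500, 1000
genotypes = rng.integers(0, 3, size=(n_individuals, n_snps))   # allele dosages 0/1/2
cytotoxicity = rng.normal(size=n_individuals)                  # e.g. log EC50 per individual

# Test each marker separately and collect p-values (a basic GWAS-style scan).
p_values = np.array([
    stats.linregress(genotypes[:, j], cytotoxicity).pvalue
    for j in range(n_snps)
])

# Bonferroni correction for the number of markers tested.
significant = np.where(p_values < 0.05 / n_snps)[0]
print(f"Markers passing the Bonferroni threshold: {len(significant)}")
```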