| Literature DB >> 32326418 |
Pia Anneli Sofia Kinaret, Angela Serra, Antonio Federico, Pekka Kohonen, Penny Nymark, Irene Liampa, My Kieu Ha, Jang-Sik Choi, Karolina Jagiello, Natasha Sanabria, Georgia Melagraki, Luca Cattelani, Michele Fratello, Haralambos Sarimveis, Antreas Afantitis, Tae-Hyun Yoon, Mary Gulumian, Roland Grafström, Tomasz Puzyn, Dario Greco.
Abstract
The starting point of successful hazard assessment is the generation of unbiased and trustworthy data. Conventional toxicity testing relies on extensive observation of phenotypic endpoints in vivo and in complementary in vitro models. The increasing development of novel materials and chemical compounds dictates the need for a better understanding of the molecular changes occurring in exposed biological systems. Transcriptomics enables the exploration of organisms' responses to environmental, chemical, and physical agents by observing the molecular alterations in greater detail. Toxicogenomics (TGx) integrates classical toxicology with omics assays, thus allowing the characterization of the mechanism of action (MOA) of chemical compounds, novel small molecules, and engineered nanomaterials (ENMs). A lack of standardization in data generation and analysis currently hampers the full exploitation of toxicogenomics-based evidence in risk assessment. To fill this gap, TGx methods need to take into account appropriate experimental design and possible pitfalls in the transcriptomic analyses, as well as data generation and sharing that adhere to the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. In this review, we summarize the recent advancements in the design and analysis of DNA microarray, RNA sequencing (RNA-Seq), and single-cell RNA-Seq (scRNA-Seq) data. We provide guidelines on exposure time, dose, and complex endpoint selection, as well as sample quality considerations and sample randomization. Furthermore, we summarize publicly available data resources and highlight applications of TGx data to understand and predict chemical toxicity potential. Additionally, we discuss the efforts to implement TGx into regulatory decision making to promote alternative methods for risk assessment and to support the 3R (reduction, refinement, and replacement) concept. This review is the first part of a three-article series on Transcriptomics in Toxicogenomics. These initial considerations on Experimental Design, Technologies, Publicly Available Data, and Regulatory Aspects are the starting point for the rigorous and reliable data preprocessing and modeling described in the second and third parts of the review series.
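As a purely illustrative complement to the sample-randomization guideline mentioned in the abstract (not code from the paper; sample IDs, group labels, and batch size are hypothetical), the sketch below shuffles samples across hybridization batches so that exposure groups are not confounded with processing days:

```python
import random

# Hypothetical phenodata: sample IDs with their exposure group (illustrative only).
samples = [
    {"sample_id": f"S{i:02d}", "group": group}
    for i, group in enumerate(
        ["control"] * 6 + ["low_dose"] * 6 + ["high_dose"] * 6, start=1
    )
]

random.seed(42)            # fixed seed so the assignment is reproducible
random.shuffle(samples)    # randomize processing order across all groups

BATCH_SIZE = 6             # e.g., six arrays hybridized per day (assumed value)
batches = [samples[i:i + BATCH_SIZE] for i in range(0, len(samples), BATCH_SIZE)]

for day, batch in enumerate(batches, start=1):
    ids = ", ".join(f"{s['sample_id']}({s['group']})" for s in batch)
    print(f"Hybridization day {day}: {ids}")
```

In practice, a blocked randomization in which every batch contains the same number of samples from each exposure group protects against batch effects more reliably than simple shuffling.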
Keywords: alternative risk assessment; engineered nanomaterials (ENM); experimental design; high throughput; microarrays; sequencing; toxicogenomics (TGx); toxicology; transcriptomics
Year: 2020 PMID: 32326418 PMCID: PMC7221878 DOI: 10.3390/nano10040750
Source DB: PubMed Journal: Nanomaterials (Basel) ISSN: 2079-4991 Impact factor: 5.076
Example of a phenodata table reporting the metadata recorded for each sample.
| Sample | Sample ID | Date | Material | Dose | Time | Day | Array.N | Hybr. Date | Slot | Slide Bar Code | Operator |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 12 | 121 | 12/03/2019 |  | 10 | 24 |  | 2 | 18.3. |  |  | PKI |
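Reproducible reporting of this kind of phenodata is easiest when the table is kept machine-readable. Below is a minimal, hypothetical sketch (the file name, column list, and checks are assumptions, not taken from the paper) of how such a table could be loaded and validated with pandas before analysis:

```python
import pandas as pd

# Columns expected in the phenodata table (mirroring the example above; assumed names).
REQUIRED_COLUMNS = [
    "Sample", "Sample ID", "Date", "Material", "Dose", "Time", "Day",
    "Array.N", "Hybr. Date", "Slot", "Slide Bar Code", "Operator",
]

def load_phenodata(path: str) -> pd.DataFrame:
    """Read a phenodata CSV and flag missing columns or empty metadata fields."""
    pheno = pd.read_csv(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in pheno.columns]
    if missing:
        raise ValueError(f"Phenodata is missing columns: {missing}")
    incomplete = pheno[REQUIRED_COLUMNS].isna().any(axis=1)
    if incomplete.any():
        print(f"Warning: {incomplete.sum()} sample(s) have empty metadata fields.")
    return pheno

# Example usage with a hypothetical file name:
# pheno = load_phenodata("phenodata.csv")
# print(pheno.head())
```

Keeping such checks in the analysis pipeline makes it explicit when sample annotations (dose, exposure time, hybridization batch, operator) are incomplete, which is exactly the information needed later to detect and correct batch effects.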