Toxicoepigenetics for Risk Assessment: Bridging the Gap Between Basic and Regulatory Science.

Anne Le Goff1, Séverine Louvel2, Henri Boullier3, Patrick Allard1,4.   

Abstract

Toxicoepigenetics examines the health effects of environmental exposure associated with, or mediated by, changes in the epigenome. Despite high expectations, toxicoepigenomic data and methods have yet to be significantly utilized in chemical risk assessment. This article draws on a social science framework to highlight hitherto overlooked structural barriers to the incorporation of toxicoepigenetics in risk assessment and to propose ways forward. The present barriers stem not only from the lack of maturity of the field but also from differences in constraints and standards between the data produced by toxicoepigenetics and the regulatory science data that risk assessment processes require. Criteria and strategies that frame the validation of knowledge used for regulatory purposes limit the application of basic research in toxicoepigenetics toward risk assessment. First, the need in regulatory toxicology for standardized methods that form a consensus between regulatory agencies, basic research, and the industry conflicts with the wealth of heterogeneous data in toxicoepigenetics. Second, molecular epigenetic data do not readily translate into typical toxicological endpoints. Third, toxicoepigenetics investigates new forms of toxicity, in particular low-dose and long-term effects, that do not align well with the traditional framework of regulatory toxicology. We propose that increasing the usefulness of epigenetic data for risk assessment will require deliberate efforts on the part of the toxicoepigenetics community in 4 areas: fostering the understanding of epigenetics among risk assessors, developing knowledge infrastructure to demonstrate applicability, facilitating the normalization and exchange of data, and opening the field to other stakeholders.
© The Author(s) 2022.

Keywords:  Toxicoepigenetics; epigenetics; regulatory science; risk assessment; toxicoepigenomics

Year:  2022        PMID: 35860623      PMCID: PMC9290111          DOI: 10.1177/25168657221113149

Source DB:  PubMed          Journal:  Epigenet Insights        ISSN: 2516-8657


Introduction

The epigenome, that is, the collection of covalent chemical modifications to the DNA and histone proteins, and of non-coding RNAs that regulate gene expression in a heritable fashion, is increasingly recognized as a key mediator of environmental response and a target of toxicants. Toxicoepigenetics leverages and combines advances in the fields of epigenetics and toxicology to elucidate molecular initiating events, provide a molecular basis for delineating windows of environmental sensitivity,[3,4] describe a mechanism of transgenerational inheritance running parallel to that of DNA, identify biomarkers of current or prior exposure, and suggest potential therapeutic avenues. However, to date, the potential contribution of toxicoepigenetics to chemical risk assessment has yet to concretely materialize. Convergence remains sparse, even on more limited goals, despite the sustained commitment of a significant number of basic scientists and risk assessors to advancing this issue through workshops and reviews. This article examines why. Previous commentators on this limited integration of toxicoepigenetics into risk assessment have highlighted technical, methodological, and scientific obstacles.[7-11] While these barriers are quite significant, the focus of our article is another side of the issue, one that has largely been overlooked. Using a social science approach, we consider potential obstacles stemming not only from the quantity or quality of the science itself, but from science-making, that is, obstacles arising from differences between the practices of basic science research and risk assessment. The knowledge basis for risk assessment is regulatory science, that is, data, methods, and tools used to support regulation and policy making. This knowledge is produced by academic institutions, government agencies, or private companies.
Scholars in the social studies of science have argued that criteria for relevance and standards of evidence—the definition of what constitutes “best science” or “sound science”—depend to some extent on the context of the application of science and that regulatory science differs in this regard from other forms of science-making.[13,14] The general purpose of regulatory science is to answer policy-relevant questions, for instance whether a given chemical can be sold on the market or should be restricted. Such policy goals, as well as constraints such as the practical need to develop assays that are time- and cost-effective, weigh on the production of regulatory science. Regulatory policy constraints are absent from basic science contexts: while research laboratories usually produce data for academic journals and the advancement of science, regulatory science responds to normative regulatory demands. Consequently, the production of regulatory evidence for a risk proceeds in a distinctly different fashion from the modes of constructing evidence in basic research.[15-17] Not all scientific knowledge is included in risk assessment processes. The relevance of science to this purpose is established in reference to risk assessment tools that are the result of a long process of formalizing procedures and standardizing methods that began in the 1980s (see Box 1). The development of this toolbox is simultaneously a technical process and a social and political challenge: principles and protocols for producing evidence (eg, to document the toxicity of a chemical) must be discussed, refined, and validated according to technical procedures; agencies and industries need to reach a consensus as the users of these new standards. 
Therefore, risk assessment has an “evidential culture” with a high threshold—that is, criteria and practices that admit forms of knowledge that have been historically validated and follow well-established rules in the construction of evidence.[15,18] The use of formalized tests, methods, and procedures to estimate the safety of a given chemical for human health is further dependent on the type of chemical—for example, pesticides, pharmaceuticals, or environmental pollutants—and the kind of assessment—for example, environmental or occupational exposure; short or long-term toxicity. Taking these practical issues into account is extremely relevant when exploring reasons for the lack of integration of toxicoepigenetics in risk assessment.
Box 1.

Regulation of chemicals in the US and EU.

• For more than 40 years, most industrial chemicals marketed in the United States have been assessed under procedures set out in a single piece of regulation, the 1976 Toxic Substances Control Act (TSCA), amended by the 2016 Frank R. Lautenberg Chemical Safety for the 21st Century Act.[19,20] Under this framework, the US Environmental Protection Agency (EPA) controls the marketing of new compounds and assesses existing compounds.
• In Europe, similar provisions are laid out in the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation adopted in 2006 and applied by the European Chemicals Agency (ECHA).[21]
• Both regulatory agencies follow the principles described in the so-called Red Book, “Risk Assessment in the Federal Government,” produced by the US National Research Council,[22] which describes a four-step approach to the evaluation of risks, namely (1) hazard identification, (2) dose-response assessment, (3) exposure assessment, and (4) risk characterization.
• The principles of the Red Book, along with the regulatory frameworks that were adopted in the past decades, have encouraged the development of a community of professionals (in particular, regulatory toxicologists) who participate in the production of data on chemicals following standardized tests capable of documenting risks. Over time, the Organization for Economic Co-operation and Development (OECD) has managed to harmonize local testing protocols into international test guidelines that are presently the gold standard for the regulatory assessment of chemicals.[18,23]
In the present commentary, we investigate whether structural differences between the fields of basic research in toxicoepigenetics and risk assessment impede the transfer of epigenetic data. A small but persistent community of scientists involved in regulatory and basic science has formed around this issue. They have promoted paths for the use of toxicoepigenetic data in risk assessment and encouraged debates about bottlenecks via workshops, sessions at major toxicology conferences, and review articles and opinion pieces in specialized and interdisciplinary journals. Here, we examine these efforts, using a qualitative approach to identify barriers to the process of including epigenetic data in risk assessment. We analyzed this literature and interviewed a dozen experts active in these debates, specifically, scientists with positions in academia (n = 3), industry (n = 2), regulatory agencies (n = 2), and an environmental NGO (n = 1), as well as risk assessors in governmental agencies (n = 2). All but one were based in the US. (The study was certified exempt by the Institutional Review Board (IRB) of the University of California, Los Angeles (UCLA); IRB No. 18-000621. Lists of the publications analyzed and the interview questions can be found in the Supplemental Materials section.) Through a qualitative and focused sociological approach, we aim to provide an in-depth analysis of the barriers that impede the use of toxicoepigenetics in risk assessment. In that context, a qualitative approach best captures first-hand experience of knowledge exchange and barriers. We selected interviewees using a selective sampling strategy focused on generating insights into key issues, rather than on generalizing from a sample to a population. Thus, our relatively small-scale set of interviews was not taken to be representative of the existing range of views on toxicoepigenetics in risk assessment.
Instead, we interviewed scientists and risk assessors who have an expert knowledge of their interface and are directly involved through their research or institutional work in debates over the use of toxicoepigenetics in risk assessment. We analyzed the data from interviews using an inductive approach, identifying themes that emerged out of the interviews. We compared interview data with reviews and meeting reports that specifically address the intersection between toxicoepigenetics and risk assessment, so as to identify recurring themes, issues, and ways to move forward that are currently under discussion in the field. In the present perspective article, we will highlight the gaps between toxicoepigenetic data produced by basic science and the standards for regulatory science that constitute the knowledge basis of risk assessment. Then, we will outline several ways to address these barriers and indicate areas in which toxicoepigenetics could most readily contribute to risk assessment.

The Promise and Current Status of Toxicoepigenetics for Risk Assessment

Toxicoepigenetics: A booming area of research

Toxicoepigenetics is a rapidly expanding area of research, whose development parallels that of epigenetics. The annual volume of publications has increased continuously since 2000, and by a factor of 2.66 in the last 10 years (Figure 1; a description of our literature review approach can be found in the Supplemental Materials section). Over the past 2 decades, research agencies have strongly supported the field, for example, by funding multidisciplinary consortia in toxicoepigenetics both in the United States (NIH Roadmap Epigenomics Project, TaRGET consortium, other NIEHS program solicitations) and in the European Union (eg, the Cooperation Work Programmes for Environment and Health of the Seventh Framework Programme). The field has benefited from fast technological advances that enable laboratories to acquire and analyze steadily increasing volumes of data, which now include multi- and single-cell ‘omics. Toxicoepigenetics has gradually expanded in scope and methodologies and now stands at the intersection of a large number of research areas that include toxicology but also oncology, biochemistry, molecular biology, genetics, developmental biology, and molecular epidemiology (Figure 2). While epigenetic studies have been performed on many toxicants (130 in 2021), the bulk of them focus on a small number of chemicals, namely certain metals and metalloids (arsenic, lead, cadmium, methylmercury, nickel, chromium), bisphenol A (BPA), and polycyclic aromatic hydrocarbons (PAHs) (Figure 3).
Figure 1.

Volume of articles in toxicoepigenetics (original research only), in relation to the volume of articles in epigenetics.

Source: Web of Science.

Figure 2.

Top 10 disciplines of journals publishing research in toxicoepigenetics.

Source: Web of Science.

Figure 3.

Toxicants most studied in toxicoepigenetics.

Source: Web of Science.

Studies have shown associations between environmental exposure and changes in DNA methylation, histone modifications, and non-coding RNA that together alter gene expression and chromatin structure stably across cell divisions and generations. Toxicoepigenetics highlights novel forms of toxicity that are irreducible to known forms of cytotoxicity or genotoxicity. In particular, lower doses than those generally considered to be safe can have deleterious long-term effects through changes in the epigenome, highlighting the significance of critical windows of exposure. One of the most fruitful areas of application of toxicoepigenetics is the Developmental Origins of Health and Disease (DOHaD) paradigm, which builds on the association between epigenetic changes in early life and the development of adverse health outcomes, such as metabolic disorders, cancer, or other diseases, later in life.

Expectations: The usefulness of toxicoepigenetics for risk assessment

The contributions to risk assessment that are expected of toxicoepigenetics range from insight into the mechanism of action to novel biomarkers of exposure prior to changes in gene expression, cell signaling, or pathological findings.[2,27,28] Potential future applications include measuring the long-term susceptibility of a population to a given environmental exposure; developing epigenetic tests to detect at-risk, sensitive populations; and intervening at the molecular level to reverse the epigenetic changes implicated in the toxicity response to exposure.[3,29,30] At present, proponents of toxicoepigenetics pursue relatively modest goals, namely, to contribute to well-established risk assessment processes and to complement existing methods where they harbor gaps: “[T]he intention is not to create new in vivo test methods solely for epigenetics.”(p21) Two major directions for using epigenetic data in risk assessment procedures emerge from the literature. First, epigenetic data can contribute to the weight of evidence in assessing chemical compounds. The weight of evidence approach is a systematic method for decision-making that “involves consideration of known lines of evidence (LoEs) where a ‘weight’ is assigned to each LoE, according to its relevance and reliability.”(p9) Epigenetic data may help corroborate other lines of evidence when only incomplete evidence exists.[32-35] In such a holistic approach, demonstration that an epigenetic modification is adverse may not be a precondition. Departure from a normal-range epigenome may become indicative of toxicity in the context of other genomic or physiological evidence.
Second, toxicoepigenetics may help prioritize toxicants of concern, especially in the context of the new alternative methods (NAMs) strategy.[36,37] In the past, alternative methods such as structure-activity relationships (SARs) have shown their potential to predict risks for new chemicals by quickly providing initial data that can then be investigated by conventional and more time-consuming and expensive risk assessment methods. Even in the absence of a predictive relationship between epigenetic changes and specific health outcomes, stable epigenetic changes in known biological pathways can demonstrate the ability of a chemical of interest to disrupt the epigenome and prompt further investigation with conventional toxicological assays.[38,39]

A lack of concrete progress

In the early 2010s, an OECD expert group working in close collaboration with the OECD advisory group on testing and assessment of endocrine disruptors (EDTA-AG) made suggestions for the integration of epigenetics into OECD chemical safety assessment regulatory activities, in the context of the development of new or revised Test Guidelines (TGs) for the detection of endocrine active chemicals and endocrine disruptors. An OECD detailed review paper suggested that epigenomic dysregulation played a key role in mediating the effects of exposures to endocrine disruptors and that several epigenetic endpoints should be considered for inclusion in the guidelines. Suggestions by the group of experts ranged from recommendations on published cell culture and animal systems that could potentially be used for testing the epigenetic effects of endocrine disruptors, to identifying OECD Test Guidelines that could potentially be adapted for epigenomic studies of the effects of endocrine disruptors (eg, TGs 415 and 416 for testing 1- and 2-generation reproduction toxicity). In 2016, a second OECD working group was created to develop an integrated approach to the testing and assessment (IATA) of chemical non-genotoxic carcinogens, including “epigenetic carcinogens.” The purpose of this approach is to assist regulators in their assessment of chemical non-genotoxic carcinogens with the clustering of relevant assays—including epigenetic ones—that can be used to address the biological processes associated with cancer onset and progression. However, progress has been slow. Tests in the OECD Conceptual Framework have not been updated to include epigenetic endpoints, on the grounds that it is “still too early at this time to augment current test guidelines, which have been used extensively”; that is to say, “their value is known” while the added value of epigenetic tests is not.(p28) In addition, no agency has yet developed detailed epigenomic risk assessment guidance, although guidelines published by major regulatory agencies (eg, the OECD, the US EPA, and the European Chemicals Agency) acknowledge epigenetics as a regulatory mechanism for endocrine-disrupting modes of action. Lastly, there are currently no testing guidelines for combining relevant epigenetic biomarkers and tools with standardized, commonly used tests. Although not unusual for new sciences, this situation can be attributed to significant structural differences between the types of knowledge produced and used in the contexts of basic research in epigenetics versus regulatory science.

From Basic Research in Toxicoepigenetics to Risk Assessment Applications: Current Barriers

Standardizing toxicoepigenetic data

The expansion of toxicoepigenetics has been accompanied by strong dynamics of scientific exploration and technological innovation, leading to the accumulation of voluminous but heterogeneous data, which hinders the comparison and interpretation of results, not only for research purposes but also for risk assessment. For example, advances in DNA methylation assessment include, non-exhaustively, several generations of methylated DNA arrays, methylated DNA immunoprecipitation, reduced representation bisulfite sequencing (RRBS), and whole-genome bisulfite sequencing (WGBS). These technologies differ in their library preparation, resolution, and epigenome coverage. Differences at the detection level are compounded by differences in analytical and statistical tools, which result in datasets that are difficult to compare.[43-45] The complexity of assessment is further compounded by the variety of tissues and cell types subjected to it: the epigenome shows age-, tissue-, and cell-specific patterns, and epigenetic changes display some, but limited, correlation across samples.[3,46] Cell-type heterogeneity represents a major hurdle for integrating toxicoepigenetic data in risk assessment. To date, in humans, it remains unclear whether surrogate, accessible tissues such as blood or placenta provide an accurate indication of epigenetic changes in target tissues such as brain, heart, or liver that cannot be easily collected from human donors. The field of epigenetic toxicology has started developing strategies to address these challenges.
A subset of the epigenetics literature has emerged that specifically aims at comparing methylomics platforms and improving data interpretation.[47,48] The integrative analysis of different epigenetic parameters offers a more complete and dynamic view of the epigenome and facilitates the functional interpretation of epigenetic changes. Single-cell epigenomics allows interrogation of the transcriptome and epigenetic marks (5mC, histone modifications, non-coding RNA) at single-cell resolution, and therefore provides tissue and cell-type specificity. However, these approaches require new computational pipelines for data analysis, which again poses a difficulty in establishing standards and comparing datasets that have been generated across different platforms and methods. Significant investments have also been made to coordinate research activities in epigenetics. Institutions have shown interest in building tools to establish standards and systematize knowledge, in particular with the goal of producing reference epigenomes. After the NIH Roadmap Epigenomics Project led in 2015 to the profiling of 111 reference epigenomes, the International Human Epigenome Consortium (IHEC) aims to complete at least 1000 reference epigenomes and establish standards for assays and metadata. The NIEHS-funded TaRGET Consortium program is focused on 6 selected substances and aims to standardize research protocols in terms of exposure conditions, target and surrogate tissues, cell types, and assays. The TaRGET consortium aims to generate epigenetic signatures of exposure across tissues that offer a baseline for assessing epigenetic effects of exposure, which can further support the production of human epigenetic data. Such scientific and institutional strategies will undoubtedly contribute to an increased understanding of epigenetic variability in the face of exposures and improve the comparability and reproducibility of epigenetic data in research.
However, this effort to homogenize and standardize research protocols will not immediately or directly bear on the usefulness of epigenetic data for risk assessment, as it does not cover all of the knowledge needs of regulatory toxicology. For instance, the challenges of using noncoding RNAs (ncRNAs) as biomarkers stem not only from the absence of standardized procedures for data normalization and referencing, and from the need to develop statistical estimators and historical control data to ensure comparability and facilitate assessment. They also relate to the need to incorporate ncRNAs into the standardized in vitro assays and in vivo repeated-dose studies used in risk assessment. This may require adapting the guidelines for those assays, for example, by defining test group sizes that provide sufficient statistical power to detect ncRNA effect sizes.

Toxicological relevance of epigenetic data

The second major issue in the use of epigenetic datasets is their interpretation. There is widespread agreement in the literature and among interviewed stakeholders on the challenges of translating epigenetic molecular mechanisms into relevant physiological endpoints. Epigenetic changes do not necessarily alter gene function; rather, akin to mutations, they can be neutral, adaptive, adverse, or emergent, that is, resulting in an adverse effect at a later stage. In addition, in contrast with genetic mutations, the persistence of epigenetic changes is variable. Therefore, epigenetic endpoints often do not offer physiologically interpretable data in terms of adverse effects and downstream health outcomes.[7,32,34,35,55-57] The identification of persistent adverse effects is at the core of the hazard identification component of risk assessment, and such effects must also be quantifiable in order to be incorporated into a dose-response assessment. The identification of epigenetic adverse effects is rendered difficult by the absence of a standard normal epigenome, in the context of tissue, age, and population variability and the fundamental plasticity of the epigenome. It therefore becomes particularly challenging in a regulatory assessment process to distinguish true adverse epigenetic effects from background noise in order to describe new, often undocumented, modes of action and health effects of environmental stressors. Arsenic is a case in point. Arsenic is a well-established Group 1 human carcinogen, but it does not show overt genotoxicity. In light of its prevalence and the severity of its health effects, it stands in first place on the US Agency for Toxic Substances and Disease Registry's Substance Priority List and has been the subject of a robust funding effort by federal agencies over the past decade. Arsenic is currently, by far, the most studied toxicant in toxicoepigenetics (Figure 3). Epigenetics appears to be a promising approach, as several epigenetic pathways are involved in mediating arsenic toxicity.
Chronic arsenic exposure shows a dose-response relationship with changes in DNA methylation, including in genes associated with arsenic-mediated diseases, as well as with histone post-translational modifications and changes in microRNA expression.[60-62] Commentators have highlighted several lines of application of epigenetic data on arsenic to risk assessment. Mechanistic insights into the epigenetic changes associated with arsenic exposure could help identify modes and mechanisms of action and contribute to the weight of evidence used for the risk assessment of arsenic toxicity. Epigenetic changes associated with arsenic exposure could also serve as biomarkers of arsenic exposure, even below thresholds at which disease outcomes or pathological changes are apparent. Epigenetic studies may allow for a better characterization of the variation in human response to arsenic exposure, in particular regarding early-life exposure.[63,64] However, the wealth of toxicoepigenetic data produced on arsenic has yet to be significantly utilized by regulators. Basic scientists need to demonstrate how these data can be interpreted and integrated with conventional toxicological data in order to build a weight of evidence that can be used in risk assessment. Here, the key obstacle is the difficulty of interpreting epigenetic changes and linking them to phenotypes.[62,65] Most human and animal studies of arsenic in toxicoepigenetics have focused on detecting changes in the epigenome that correlate with arsenic exposure without identifying a causal relationship between epigenetic changes and disease outcomes. At a minimum, studies of exposure should include functional measures such as gene expression. Instead, they tend to rely on other health studies, which show that the types of epigenetic alterations they detected are associated with disease phenotypes, to speculate on the contribution of arsenic-caused epigenetic changes to broad disease phenotypes.

Novel forms of toxicity

One of the most intriguing aspects of epigenetics is the—sometimes long—delay observed between epigenetic changes due to exposure and disease outcome. While it harbors a strong potential for public health innovation, as the dynamic field of DOHaD demonstrates, it also represents a significant challenge for risk assessment. One illustrative example can be found in a study performed with the endocrine disruptor bisphenol A (BPA). Briefly, in this study, animals received early postnatal exposures to environmentally relevant low doses of BPA. Under normal laboratory conditions, including a normal diet, the adult animals displayed body weight and metabolic profiles that were fully within the normal range. However, following a diet challenge (“Western diet,” higher in fat) in adulthood, males showed a dramatic alteration of their lipidomic and metabolomic profiles, as well as an enlarged liver indicative of a profound metabolic dysfunction. Expression and epigenomic analyses revealed that the Egr1 locus, which mediates diet and liver metabolic response, was epigenetically reprogrammed by the early-life BPA exposure. These epigenetic alterations did not translate into transcriptional or pathophysiological changes until the second, later-life, Western diet challenge, which pushed the animals into metabolic dysfunction. These results are both highly relevant to real life, from the timing of exposure to the incorporation of a Western-style diet, and particularly challenging to funnel into a risk assessment process. What should be assessed: the diet, the exposure, or the interaction of both? Changes in the epigenome can be quantified following exposure, but they do not immediately translate into pathological or metabolic findings, with pathological effects correlated to exposure occurring much later in life. Thus, in this context, exposure to BPA could be considered a sensitizer but not necessarily a chemical exposure that directly causes metabolic dysfunction.
Such findings raise the question of whether other endocrine-disrupting chemicals should be tested in a similar fashion for risk assessment purposes. Yet the detection of these effects would require a significant transformation of the methods used for measuring toxicity, in order to explore a wider range of doses, including ultra-low doses, and to test them across a large array of exposure windows and in concert with co-exposures or secondary challenges, as seen above. Hence, the difficulties of implementing an epigenetic framework for toxicity testing and risk assessment include prohibitive costs and timescales, without the guarantee of being able to directly and mechanistically link epigenetic changes with pathological and disease findings months or years later. Testing for these long-term toxic effects is both the exciting future of the field and its biggest challenge for risk assessment yet.

Mapping Routes for the Advancement of Toxicoepigenetics in Risk Assessment

Even relatively modest, promising directions for toxicoepigenetics to contribute to risk assessment have faced stumbling blocks. Many of these challenges are not unique to toxicoepigenetics but have materialized in the past for other innovative branches of toxicology. We will now highlight 4 areas in which deliberate efforts on the part of the toxicoepigenetics community may be fruitful to gather broader support beyond epigeneticists and to overcome barriers.

Develop knowledge infrastructure to demonstrate applicability

Perhaps the most profound challenge to using epigenetic data in risk assessment is uncertainty about its physiological relevance. This uncertainty is not specific to toxicoepigenomics but applies to molecular data in general. In the past, toxicogenomics tackled this issue with the strategy of “phenotypic anchoring,” that is, “coupl[ing] the unique gene expression patterns induced by chemical exposures” “to standard toxicological indices, such as clinical chemistry or tissue pathology.” In other words, researchers in toxicogenomics strove to systematically “ground” novel genomic methods in well-established and proven toxicological methods. A comparably systematic effort to quantitatively link epigenomic data to traditional toxicological endpoints has not yet been undertaken. The aforementioned heterogeneity of epigenetic data, the variation of the epigenome across tissues and time, and the uncertainty that epigenetic changes alter gene function render such a form of direct coupling unlikely. However, stakeholders have considered other strategies to identify physiological relevance, such as prototype assessments.[35,71,72] Also known as case studies or model compounds, prototype assessments can serve as a proof of concept to demonstrate the relevance of epigenetic endpoints by assessing epigenetic changes caused by compounds for which there is substantial prior knowledge of apical endpoints. Ultimately, this strategy can also support the interpretation of epigenetic data and validate epigenetic models of toxicity. Prototype assessments with epigenomic data have been conducted within the Next Generation (NexGen) of Risk Assessment program initiated by the EPA for benzene and other leukemogens, ozone, and polycyclic aromatic hydrocarbons (PAHs). Prototype assessments can be used in conjunction with qualitative models such as adverse outcome pathways (AOPs).
AOPs appear to be a promising avenue for the integration of epigenetic data into risk assessment.[33,54,74] An AOP is a structured representation of biological evidence that connects a molecular initiating event to an adverse outcome through the identification of key biological events, drawing on multiple, heterogeneous datasets.[75,76] AOP frameworks demonstrate biological plausibility by organizing the existing evidence into holistic, visually representable networks. As the sociology of science has shown, visual displays such as charts, figures, and tables organize data in a way that not only represents it but in itself produces analyses. Angrish and coauthors propose, for example, that an AOP framework can help interpret the multifarious epigenetic data related to arsenic exposure. The AOP framework addresses the lack of physiological relevance by linking molecular changes in DNA methylation, histones, and small noncoding RNAs to typical toxicological endpoints at the cell, tissue, and organ levels. It supports the interpretation, and therefore the relevance, of epigenetic “information that, if not properly contextualized, is otherwise perceived as tenuous” or mere noise. (p8) Considered within this framework, epigenetic data may acquire a relevance to physiological endpoints that it does not have on its own. The framework also highlights data gaps for epigenetic research to address in order to become significant to risk assessment. AOPs have been developed from within regulatory science, with the support and guidance of the OECD, and AOPs that pass the review process are endorsed by the OECD. Successfully scaling up the AOP framework depends on involving a heterogeneous group of stakeholders. This may constitute an opportunity for researchers in toxicoepigenetics who strive to contribute to the pipeline of data usable in risk assessment.
The development of AOPs offers a concrete platform for collaboration with regulatory science, in particular through a wiki-based, open-source interface.
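To make the structure of an AOP concrete, the sketch below models one as an ordered chain running from a molecular initiating event through key events to an adverse outcome, with a helper that surfaces key events lacking supporting evidence, the "data gaps" mentioned above. The event names, levels of organization, and evidence labels are hypothetical illustrations loosely inspired by the arsenic example; this is not an OECD-endorsed AOP.

```python
# Minimal sketch of an adverse outcome pathway (AOP) as an ordered chain:
# molecular initiating event (MIE) -> key events (KEs) -> adverse outcome (AO).
# All names and evidence labels below are illustrative placeholders.
from dataclasses import dataclass, field


@dataclass
class KeyEvent:
    name: str
    level: str                                     # eg "molecular", "cellular", "tissue", "organ"
    evidence: list = field(default_factory=list)   # supporting datasets or studies


@dataclass
class AOP:
    events: list  # ordered: MIE first, AO last

    def data_gaps(self):
        """Key events with no supporting evidence: candidate targets for new research."""
        return [e.name for e in self.events if not e.evidence]


# Hypothetical arsenic-inspired chain; epigenetic key event left unsupported.
aop = AOP(events=[
    KeyEvent("arsenic uptake and oxidative stress", "molecular", ["study A"]),
    KeyEvent("altered DNA methylation at tumor suppressors", "molecular", []),
    KeyEvent("dysregulated gene expression", "cellular", ["study B"]),
    KeyEvent("hyperplasia", "tissue", ["study C"]),
    KeyEvent("carcinoma", "organ", ["epidemiological data"]),
])

print(aop.data_gaps())
```

Read this way, an AOP is less a model than a bookkeeping device: it records where molecular evidence connects to apical endpoints and where it does not yet.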

Facilitate the normalization and exchange of data

In the words of our interlocutors, the integration of toxicoepigenetics into risk assessment further depends on “making sure that we are speaking each other’s language.” Social studies of science have shown the efficiency of building “boundary objects,” that is, scientific objects that help groups of actors coming from different perspectives with different objectives (eg, basic, regulatory, and industry scientists) share a frame of reference and work toward common goals. In particular, information infrastructures (such as databases, directories, reporting tools, standard procedures, guiding principles of data interpretation) operate as boundary objects in 2 ways: first, they facilitate the circulation of information between practitioners; second, they support data interpretation and allow for the development of shared meanings. However, there has been a significant lag in the construction of such boundary objects in toxicoepigenetics. Epigenomic data is noticeably absent from existing large-scale governmental toxicology databases such as EPA’s Toxicity Forecaster (ToxCast) and related inter-agency governmental efforts such as Tox21. This lag is made more salient by the wealth of databases and tools related to epigenomics that have been built over the years. In particular, the NHGRI-funded Encyclopedia of DNA Elements (ENCODE) project, launched in 2003, compiles a comprehensive collection of chromatin features of the human genome including its methylome, open chromatin regions, and histone mark distributions. Its extension effort, modENCODE, complemented the human data by adding large epigenomic datasets for model organisms such as Drosophila and C. elegans. Alongside these epigenomic datasets, tools to access, manipulate, and contribute individual datasets have also appeared, such as the DeepBlue, UCSC, and WashU epigenome browsers.
A first step toward filling the gap in toxicoepigenomic data in large toxicology databases is to introduce reporting guidelines that would provide a clear framework and format for sharing data and facilitate its uptake for risk assessment purposes. Most scientific journals request the submission of raw data, which might seem to grant full access to epigenomic data for risk assessment purposes. However, entirely reanalyzing this data would represent an enormous task that does not fit within the scope of risk assessment. Reporting guidelines offer a manageable middle ground. A reporting guideline is “a checklist, flow diagram, or structured text to guide authors in reporting a specific type of research, developed using explicit methodology”; it is guidance for writing that “provides a minimum list of information needed” to ensure a manuscript can be understood, replicated, used, or included in a systematic review. There are currently few reporting guidelines in toxicoepigenetics governing data production, data interpretation, and preferred tools. Initiatives in that direction have been observed for the ‘omics sciences, in particular with the Transcriptomics Reporting Framework (TRF) under the auspices of the ECETOC and OECD. While not prescriptive, the TRF defines parameters that should be reported and standards for bioinformatic processing and statistical analysis in ‘omics studies to be used in a regulatory context. Its “embedded RBA [Reference Baseline Analysis] stipulates one specific approach to convert the experimental measures (the raw data) to the processed data ready for interpretation.” (pS37) Reporting guidelines have been shown to positively impact the quality of data reporting in preclinical animal studies, in particular reporting of randomization, blinding, and sample-size estimation.[89,90] The application of such standards to epigenomics could help form a consistent and interpretable data stream for risk assessment.
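The “minimum list of information needed” idea behind a reporting guideline is mechanical enough to sketch: a submission's metadata is checked against a required-fields checklist, and the gaps are returned. The field names below are hypothetical stand-ins, not taken from the actual TRF or any published guideline.

```python
# Minimal sketch of a reporting-guideline check, in the spirit of a
# TRF-like minimum-information checklist. Field names are hypothetical.

REQUIRED_FIELDS = [
    "organism", "tissue", "exposure_compound", "dose_levels",
    "exposure_duration", "assay_platform", "normalization_method",
    "statistical_test", "replicate_count",
]


def missing_fields(metadata):
    """Return the required fields that are absent or empty in a submission."""
    return [f for f in REQUIRED_FIELDS if not metadata.get(f)]


# Illustrative partial submission: the check flags what a reviewer or
# risk assessor would otherwise have to chase down by hand.
submission = {
    "organism": "Mus musculus",
    "tissue": "liver",
    "exposure_compound": "benzene",
    "dose_levels": [0, 1, 10],
    "assay_platform": "RRBS",
}
print(missing_fields(submission))
```

The value of such a checklist is less the code than the consensus behind the field list, which is precisely what the TRF-style initiatives described above aim to produce.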
Thanks, in particular, to its structure as a consortium involving laboratories with a legacy of research in epigenetics and toxicoepigenetics, the NIEHS- and NIDA-funded Toxicant Exposures and Responses by Genomic and Epigenomic Regulators of Transcription (TaRGET) Program could help catalyze the production of such standards for the field through the publication of policies, pipelines, and quality control parameters. Second, and importantly, scientists producing toxicoepigenetic data will increase the probability of its being used by risk assessors if they use experimental and analytic tools that are accepted and used in risk assessment. Such shared tools are so far lacking in the field. The use of shared software has proven fruitful for the uptake of gene expression data by risk assessment; gene expression, although imperfect, can serve as a proxy for epigenetic changes. In particular, the integration of transcriptomic datasets into benchmark dose (BMD) modeling through the BMDExpress software allows for the determination of a point of departure using changes in gene expression as a molecular endpoint. This funneling of a data type into standard values that are interpretable by regulators in the context of risk assessment is precisely what is currently lacking for toxicoepigenomic data. The TaRGET program takes one step in this direction by making available software to standardize various pipelines and establish correlations between chromatin states, assayed by ATAC-seq, and gene expression, assayed via RNA-seq pipelines; such data might later be integrated into a BMD framework.[92,93] While this is not an explicitly mentioned goal of the TaRGET consortium, these efforts pave the way toward a long-awaited leveraging of these kinds of data for risk assessment.
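The “funneling of a data type into standard values” can be illustrated with a toy point-of-departure calculation: find the lowest dose at which a modeled molecular response crosses a benchmark response, here set to 1 control standard deviation above the control mean. This is a deliberately simplified sketch of BMD-style reasoning, not the BMDExpress algorithm, and the doses, fold changes, and benchmark definition are hypothetical.

```python
# Minimal sketch of benchmark-dose (BMD) style reasoning on a molecular
# endpoint: interpolate the dose where the response first crosses the
# benchmark response (BMR). Not the BMDExpress algorithm; toy data only.

def bmd_by_interpolation(doses, responses, control_sd, bmr_factor=1.0):
    """Return the interpolated dose where the response first exceeds
    control response + bmr_factor * control_sd (a point of departure),
    or None if the BMR is never reached in the tested range."""
    target = responses[0] + bmr_factor * control_sd
    pairs = list(zip(doses, responses))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 < target <= r1:
            # linear interpolation between the bracketing dose groups
            return d0 + (target - r0) * (d1 - d0) / (r1 - r0)
    return None


# Hypothetical dose-response for one gene's expression change:
doses = [0.0, 1.0, 3.0, 10.0, 30.0]    # dose units are arbitrary
responses = [1.0, 1.1, 1.6, 2.8, 4.0]  # fold change relative to control
bmd = bmd_by_interpolation(doses, responses, control_sd=0.4)
print(round(bmd, 2))  # -> 2.2
```

The point of the exercise is the output's form, not its value: a single dose number is something a risk assessor can compare across chemicals and endpoints, which raw epigenomic profiles are not.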

Support the transfer of knowledge

Efforts to promote toxicoepigenetics for risk assessment have been curbed by a high knowledge barrier for scientists who are not active in the field. A recent survey of the views on epigenetics of 40 EU regulatory experts and toxicologists found that the majority of respondents deemed their own scientific expertise insufficient to evaluate the regulatory benefits of epigenetics. Asked about the advantage and feasibility of including epigenetic endpoints in OECD Test Guidelines for endocrine disruptors, a combined 53% to 65% of regulatory experts responded either “don’t know” or “no reply.” Comments to these answers underlined the gap between current research in epigenetics and its appropriation in the risk assessment space: “I do not have any experience with the potential use of epigenetic tests”; “Not yet sufficiently familiar with these tests to comment.” These opinions reflect broader attitudes of scientists toward alternative testing strategies: while a majority claims a broad commitment to increasing the share of such strategies in their work, few are in fact regularly using them, citing technical barriers and uncertainty over their regulatory acceptance. Learning about toxicoepigenetic assays requires a considerable investment for outsiders to the field. It is poised to remain a low priority for risk assessors and regulatory scientists as long as epigenomic data is not part of regulatory science. However, in a circular fashion, epigenetics cannot become admissible evidence in regulatory science without risk assessors and agencies being relatively familiar with it and trusting it. Closing this gap will require a deliberate effort on the part of the toxicoepigenetics community.
The Toxicology in the 21st Century (Tox21) program explicitly aimed to build stakeholders’ trust in new technologies through concrete efforts of validation and consensus formation, in particular through workshops.[70,96] The field of toxicogenomics offers a precedent for such a concerted effort. While in vivo assays continue to be the gold standard of risk assessment, toxicogenomic data presently contributes to some assessments in weight-of-evidence approaches.[97,98] Making toxicogenomics relevant to risk assessment involved specific efforts during the 2000s to increase mutual knowledge in both fields. As sociologist Shostak showed, this required both institutional and individual commitments. Institutionally, this effort was led by the National Center for Toxicogenomics, founded at that time, which benefited from appropriate funding to build infrastructure in terms of archives, software, and working groups. It also required the personal commitment of scientists with different backgrounds and methodologies to gain sufficient understanding of either computational biology or pathology to be able to work with their counterparts. A small yet active community of risk assessors and basic scientists is currently working to further the relationship between toxicoepigenetics and risk assessment, in particular through workshops gathering basic scientists and risk assessors, and through published proceedings. Such efforts are key to developing concrete steps, such as assays and protocols, that support the incorporation of epigenetics into the arsenal of risk assessment. Their success depends on the broad involvement of the toxicoepigenetics community but also on institutional support. It may also be key for promoters of toxicoepigenetics to work directly with regulatory experts. International testing methods are standardized by the OECD Test Guidelines Program.
Here again, precedents exist for the OECD Test Guidelines Program promoting the credibility and value of emerging testing methods, such as structure-activity relationships (SARs). In the early 1990s, structure-activity specialist Gil Veith approached OECD representatives with a software package that included a whole collection of “ready-to-use” models. His efforts, akin to scientific lobbying work, led to the use of SARs in American and European regulatory agencies a few years later. By directly involving international regulatory experts, promoters of toxicoepigenetics may foster in-depth discussions in these institutional arenas where test guidelines are discussed and standardized.

Open the field to other stakeholders

Besides large governmental endeavors, non-expert stakeholders may have a role to play in supporting the use of toxicoepigenetic data in risk assessment. One specificity of epigenetics is that it includes forms of toxicity that were not previously documented, in particular long-term and low-dose effects of exposure. While some chemical companies use epigenetic data internally, they have little incentive to investigate these effects for risk assessment purposes in the absence of any legal obligation. By contrast, research in toxicoepigenetics may command the attention of non-governmental organizations, in particular environmental NGOs, consumer defense organizations, and patient advocacy organizations. The joint mobilization of NGOs, citizen scientists, scientists, and policy-makers was critical in recognizing and starting to address endocrine disruptors in the past 2 decades.[100,101] While epigenetics is not a focus point for environmental NGOs yet, long-term and intergenerational epigenetic changes have captured the attention of the media and the public. The Escher Fund for Autism, a patient organization, has pioneered support for research on, and regulatory application of, germline epigenetics by funding and organizing science as well as by lobbying regulatory agencies. Because legal mandates are critical to the regulation of chemicals, non-expert members of the public have a significant role to play in this area. The mobilization of these actors, who regularly function as networks and are able to talk to the media, could help make epigenetic exposure, in cases such as arsenic or endocrine disruptors, a legitimate and visible public health issue commanding the attention of governments and agencies. The mobilization of NGOs and advocates depends on the participation of academic and regulatory scientists to help them document the limits of current approaches to toxicity testing.

Conclusion

Toxicoepigenetics is a highly dynamic area of research, poised to increasingly demonstrate the importance of epigenome-mediated toxicity. It has a strong potential to help toxicology meet the challenge of assessing large volumes of compounds and novel forms of toxicity. While it may in the short term help prioritize chemicals for further investigation and contribute to the weight of evidence, it could in the long term help address lifelong, inter/multi/transgenerational, and low-dose exposure effects. Toxicoepigenetics is also a relatively young area of science with a frequently noted need for standardization of methods to generate, process, and interpret data. While we can expect increasing standardization from the scientific maturation of the field, this is not the only reason for the meager integration of toxicoepigenomic methods and data into risk assessment procedures. More profoundly, it is impeded by a divergence between basic science and the evidential culture of risk assessment. For risk assessment to be able to leverage toxicoepigenetic data, this data needs to fit the criteria of relevance and validity that guide the policy-driven process of risk assessment, in addition to scientific standards of robustness and comparability. In other words, the challenge is not only one of doing more science, but of doing science differently, in a way that acknowledges the constraints of risk assessment. Deliberate efforts by stakeholders to address areas of divergence, at the combined levels of individuals, institutions, and public discussion, are needed. Concrete strategies to improve communication and data interpretability through the use of tools and standards that are shared between researchers and risk assessors constitute key steps to foster the use of toxicoepigenomic approaches in risk assessment. The dividing line between basic and regulatory science that we describe here is not specific to toxicoepigenomics.
While the divergence is less pronounced in the well-established branches of toxicology that form the core of risk assessment assays, it becomes a major hurdle in novel areas. The case study of toxicoepigenetics can inform innovation strategies in the ‘omics and in turn inform the future of risk assessment.