Ellen Ingre-Khans, Marlene Ågerstrand, Anna Beronius, Christina Rudén.
Abstract
Regulatory authorities rely on hazard and risk assessments performed under REACH to identify chemicals of concern and to take action. Therefore, these assessments must be systematic and transparent. This study investigates how registrants evaluate data and report those evaluations under REACH, and the procedures established by the European Chemicals Agency (ECHA) to support these data evaluations. Data on the endpoint repeated dose toxicity were retrieved from the REACH registration database for 60 substances. An analysis of these data shows that the system for registrants to evaluate data and report these evaluations is neither systematic nor transparent. First, the current framework focuses on reliability but overlooks the equally important aspect of relevance, as well as how reliability and relevance are combined to determine the adequacy of individual studies. Reliability and relevance aspects are also confused in the ECHA guidance for read-across. Second, justifications for reliability evaluations were mainly based on studies complying with GLP and test guidelines, following the Klimisch method. This may result in GLP and guideline studies being considered reliable by default, and in non-GLP and non-test-guideline data being discounted. Third, the reported rationales for reliability were frequently vague, confusing and lacking information necessary for transparency. Fourth, insufficient documentation of a study was sometimes used as a reason for judging data unreliable. Poor reporting merely affects the possibility of evaluating reliability and should be distinguished from methodological deficiencies. Consequently, ECHA is urged to improve the procedures and guidance for registrants to evaluate data under REACH to achieve systematic and transparent risk assessments.
Year: 2018 PMID: 30713660 PMCID: PMC6334497 DOI: 10.1039/c8tx00216a
Source DB: PubMed Journal: Toxicol Res (Camb) ISSN: 2045-452X Impact factor: 3.524
The analysis included 349 study summaries for 60 substances. Each study summary was assigned a reliability category (1 = reliable without restriction; 2 = reliable with restriction; 3 = not reliable; 4 = not assignable) and an adequacy category (key, supporting, weight of evidence, disregarded study, or not specified), and was based on either a study report or a publication. “Not specified” indicates that the registrant did not assign an adequacy to the study.
| Adequacy | Reliability category 1 | | | Reliability category 2 | | | Reliability category 3 | | | Reliability category 4 | | |
| | Study report | Publication | % | Study report | Publication | % | Study report | Publication | % | Study report | Publication | % |
| Key | 77 | 6 | 78 | 26 | 19 | 27 | 0 | 0 | 0 | 0 | 0 | 0 |
| Supporting | 22 | 1 | 21 | 57 | 52 | 65 | 2 | 5 | 20 | 8 | 3 | 27.5 |
| Weight of evidence | 1 | 0 | 1 | 0 | 3 | 2 | 0 | 0 | 0 | 0 | 1 | 2.5 |
| Disregarded study | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 4 | 14 | 0 | 0 | 0 |
| Not specified | 0 | 0 | 0 | 4 | 6 | 6 | 2 | 21 | 66 | 15 | 13 | 70 |
| Total (reference type) | 100 | 7 | 100 | 87 | 80 | 100 | 5 | 30 | 100 | 23 | 17 | 100 |
| Total (study summary) | 107 (31%) | | | 167 (48%) | | | 35 (10%) | | | 40 (11%) | | |
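The percentage columns in the table above follow directly from the raw counts: for each reliability category, study reports and publications are pooled and each adequacy row is expressed as a share of that category's total. A minimal Python sketch of that calculation (the data structure and function names are our own, not from the study; counts are copied from the table):

```python
# counts[reliability_category][adequacy] = (study report count, publication count)
# Counts copied from the table of 349 study summaries for 60 substances.
counts = {
    1: {"Key": (77, 6), "Supporting": (22, 1), "Weight of evidence": (1, 0),
        "Disregarded study": (0, 0), "Not specified": (0, 0)},
    2: {"Key": (26, 19), "Supporting": (57, 52), "Weight of evidence": (0, 3),
        "Disregarded study": (0, 0), "Not specified": (4, 6)},
    3: {"Key": (0, 0), "Supporting": (2, 5), "Weight of evidence": (0, 0),
        "Disregarded study": (1, 4), "Not specified": (2, 21)},
    4: {"Key": (0, 0), "Supporting": (8, 3), "Weight of evidence": (0, 1),
        "Disregarded study": (0, 0), "Not specified": (15, 13)},
}

def category_total(cat):
    """Total study summaries (study reports + publications) in one reliability category."""
    return sum(sr + pub for sr, pub in counts[cat].values())

def pct(cat, adequacy):
    """Share of a category's study summaries with the given adequacy, in percent."""
    sr, pub = counts[cat][adequacy]
    return round(100 * (sr + pub) / category_total(cat), 1)

grand_total = sum(category_total(c) for c in counts)  # 349 study summaries in total
```

Running this reproduces, for example, the 27.5% of category 4 summaries used as supporting studies, and the per-category totals of 107, 167, 35 and 40.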
Fig. 1A. Number of study summaries, based on the study report and the publication, reported to follow GLP and test guidelines. The categories include the pick-list options “according to” and “equivalent or similar to” test guidelines. B. Number of study summaries reported not to follow GLP and test guidelines, or for which no information on GLP and test guideline compliance has been reported by the registrant. For GLP, this specifically includes the pick-list options “no GLP” or “no data” (i.e. GLP compliance not reported in the full study report), and for test guidelines “no guideline followed/required/available”. The distinctions within the GLP and test guideline categories are provided in ESI Table f.†
Summary of registrants’ rationales for reliability categorisation. Each study summary is assigned a reliability category and a rationale for reliability. The number of rationales is thus equal to the number of study summaries.
| Reliability category | No. of rationales | Description of rationales |
| 1 (Reliable without restriction) | 107 | Rationale for reliability explicitly stated in 19 rationales (18%) |
| | | Restrictions/deficiencies mentioned in 9 rationales, and specified in 1 rationale |
| | | Typical statements: |
| | | [According to/closely adhere to] GLP and/or guideline study (105 rationales) |
| | | The only information in 70 rationales |
| | | Other common statements: |
| | | “Well-documented” |
| | | “Scientifically acceptable/sound” |
| | | “Acceptable scientific principles” |
| | | “Fully adequate for assessment” |
| 2 (Reliable with restriction) | 167 | Rationale for reliability explicitly stated in 18 rationales (11%) |
| | | Restrictions/deficiencies mentioned in 80 rationales, and specified in 47 rationales |
| | | Typical statements: |
| | | [According to, equivalent, comparable/similar, near] GLP and/or guideline study (68 rationales) |
| | | [Prior to or non] GLP and/or guideline study (20 rationales) |
| | | GLP and/or guideline study the only information in 16 rationales |
| | | Other common statements: |
| | | “Well-documented” |
| | | “Acceptable for assessment” |
| | | “Meets generally accepted scientific standards” |
| | | “Meets basic scientific principles” |
| 3 (Not reliable) | 35 | Rationale for reliability explicitly stated in 3 rationales (9%), but in general the reasons were implicitly understood |
| | | Restrictions/deficiencies mentioned in 33 rationales, and specified in 20 rationales |
| | | Typical statements: |
| | | “Methodological deficiencies” (10 rationales) |
| | | “Insufficient documentation/data” (16 rationales) |
| 4 (Not assignable) | 40 | Rationale for reliability explicitly stated in 3 rationales (8%), but in general the reasons were implicitly understood |
| | | Typical statements: |
| | | “Insufficient documentation/data” |
| | | “Only abstract available” |
| | | “Documentation not available” (referring to EU RAR and TSCATS) |