Claudia Kurreck1, Esmeralda Castaños-Vélez2, Dorette Freyer1, Sonja Blumenau1, Ingo Przesdzing1,3, Rene Bernard1,2,3, Ulrich Dirnagl1,2,3.
Abstract
How much can we rely on whether what was reported in a study was actually done? Systematic and independent examination of records, documents and processes through audits is a central element of quality management systems. In the context of current concerns about the robustness and reproducibility of experimental biomedical research, audits have repeatedly been suggested as a remedy. However, audits are resource-intensive and time-consuming, and by their very nature may be perceived as an inquisition. Consequently, there is very little experience with, or literature on, auditing and assessment in the complex preclinical biomedical research environment. To gain insight into which audit approaches might best suit academic biomedical research, in this study we applied a number of them in a typical academic neuroscience environment consisting of twelve research groups with about 100 researchers, students and technicians, utilizing the full gamut of state-of-the-art methodology. A team of quality management specialists systematically explored several types of assessments and both internal and external audits, including the novel format of a peer audit. An experimental design template was developed (and is provided here) that takes into account and mitigates difficulties, risks and systematic errors that may occur during the course of a study. All audits were performed according to a pre-defined workflow developed by us, and outcomes were assessed qualitatively. In the final discussion of every audit we asked participating employees for feedback and documented it in the audit reports; follow-up audits were improved on this basis. We conclude that several realistic options for auditing exist that have the potential to improve preclinical biomedical research in academia. We list specific recommendations regarding their benefits and provide practical resources for their implementation (e.g. study design and audit templates, audit workflow).
Year: 2020 PMID: 33057427 PMCID: PMC7561085 DOI: 10.1371/journal.pone.0240719
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Table 1. Types of audits and assessments performed at the Department of Experimental Neurology.
For a more detailed description of specific assessments and audits, see Table 2.
| Assessments / Audits (Aim) | Who | What | How | Update cycle |
|---|---|---|---|---|
| risk analysis | QM competent personnel; scientist | specific aspects of a scientific project or study; key and support process items for which a risk exists; introduction of new standards | FMEA* | once a year for pre-identified risks, and ad hoc when the research environment changes significantly |
| internal assessment | QM competent personnel; trained scientist | evaluation of usefulness and effectiveness of implemented measures; identification of gaps | checklists; documents (e.g. SOPs); study plan / protocol | once to several times a year |
| internal audit | internal auditor of the Charité | preparation for certification and monitoring audits | checklists; documents (e.g. SOPs) | once a year, prior to external audits |
| peer audit | qualified scientist | QM system; review of research projects and processes | questionnaires; checklists; documents (e.g. SOPs); study plan / protocol | once during a project |
| certification audit | external auditor | validation of QM system, research projects and processes | checklists; ISO norm | once a year in a certification cycle |
*FMEA: Failure Mode and Effects Analysis [25]
**LabCIRS: Laboratory Critical Incident Reporting System [24].
Table 2. Description of the assessments and audits listed in Table 1.
| Assessments / Audits | When / How often | Method | Objectives |
|---|---|---|---|
| risk analysis (FMEA) | during the ISO certification, three times (once a year) | All identified risks were recorded against criteria such as flawed methods or results during project progress, project delay, error, loss of data records, etc. The risk score was determined by multiplying the influencing variables "probability of the occurrence of the risk", "significance for practice when the risk occurs" and "probability of the discovery of the risk that occurred". The higher this figure, the greater the significance of the risk and the earlier measures had to be taken to avoid it. | evaluation of processes that run the risk of not fulfilling quality, safety or legal requirements |
| error reporting (LabCIRS) | ongoing process for six years | Started with a list of errors placed in every laboratory; after two years, implementation of LabCIRS, an anonymous, free, open-source online tool. | identification of errors in the daily work routine |
| internal assessments | once to several times a year (methods: 5×) | Methods: core methods (e.g. middle cerebral artery occlusion, preparation of primary cultures of neurons) were described by scientists in SOPs or working instructions. A member of another working group, who is also an expert on the described method, examined and questioned the specific implementation, discussed the results and made suggestions for improvement where necessary; in this way, SOPs were checked for deviations or gaps. Data: the archiving procedures for the primary data, and adherence to and practicability of the related SOP, were reviewed. Documentation: project managers, scientists and QM personnel checked whether the documentation requirements were being fulfilled. Processes: a specially developed, detailed experimental/project-planning tool, available as a template in the electronic laboratory notebook, was tested and validated on two projects. | evaluation of methods, processes and data to be improved; comparison of methods; verification of usefulness and effectiveness of implemented measures; validation of relevant changes to ongoing processes or projects; review of compliance with own requirements |
| internal audits | three times (once a year) during the ISO certification, before an external audit | These regular onsite visits were carried out in teams of two to prepare for the certification and subsequent monitoring audits. Spot checks were carried out to ensure compliance with ISO 9001 and included document management, checklists, forms and the results of key performance indicators. | evaluation of compliance with the ISO norm |
| peer audit | once, in three working groups of the department | In this unidirectional peer audit, two research groups with expertise in the same field exchanged methods and protocols and reviewed the corresponding procedures and evidence of consistency. They compared methodologies, checked adherence to the protocol or published methodology and best-practice details, and discussed potential problems. Scientists with and without training in quality assurance, but with sufficient background in the audited methods, checked compliance with the detailed experimental/project-planning procedures of the audited project, as specified via a template in the electronic laboratory notebook. | plausibility checks; comparison of methods; scientific exchange; review of projects and processes |
| certification audits | three times, during the ISO certification | A certification body verified that our processes, personnel and management system complied with the ISO 9001 requirements for a quality management system (QMS). | check whether the QMS is up and running as specified; certification |
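The FMEA risk score described above is a product of three rated factors, a standard risk priority number (RPN). A minimal sketch, assuming conventional 1-10 rating scales and illustrative example risks (neither are values from the study):

```python
# Sketch of FMEA risk scoring: multiply the three rated factors.
# Scales and example risks are illustrative assumptions.

def risk_score(occurrence: int, significance: int, discovery: int) -> int:
    """Risk priority number from three 1-10 ratings: probability of
    occurrence, significance for practice, and discoverability
    (rated here so that 10 = unlikely to be discovered, as is
    conventional in FMEA)."""
    for factor in (occurrence, significance, discovery):
        if not 1 <= factor <= 10:
            raise ValueError("each factor must be rated from 1 to 10")
    return occurrence * significance * discovery

# Hypothetical risks; the higher the score, the earlier measures are taken.
risks = {
    "loss of data records": risk_score(3, 9, 7),
    "flawed method during project progress": risk_score(5, 8, 4),
    "project delay": risk_score(6, 4, 2),
}
for name, score in sorted(risks.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score}")
```

Ranking risks by the product rather than by any single factor is what lets a rarely occurring but hard-to-detect error (such as silent data loss) outrank a frequent but obvious one.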
Table 3. Examples of types of audits and assessments that were considered, but not performed, at the Department of Experimental Neurology.
| Non-Performed Assessments / Audits | Not performed because | Annotations |
|---|---|---|
| self-assessment | this complex of topics is too important for self-checks; it is carried out in the internal audit by quality personnel or trained scientists | |
| supplier, performance and compliance audits | a supplier audit (aiming to check and improve the current quality and delivery processes) is not important for an academic research institution; performance audits (checking whether an entity's operations or specific programs work as intended to achieve stated goals) and compliance audits (reviewing an organization's adherence to regulatory guidelines) were covered by the certification and monitoring audits | important audits for industry |
| accreditation | the Department of Experimental Neurology has no accreditation | only service laboratories are accredited |
Fig 1. Audit workflow.
This audit workflow describes the four steps performed during an audit in an academic preclinical research environment, but is universally applicable to all areas in which audits are performed. This audit workflow is a practical example of Deming’s PDCA (Plan-Do-Check-Act) cycle, which incorporates the process of continuous improvement.
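The four-step structure can be sketched as a simple PDCA loop; the step descriptions below are illustrative assumptions based on the figure caption, not the paper's exact wording:

```python
# Sketch of a four-step audit workflow as a PDCA (Plan-Do-Check-Act)
# loop. Step descriptions are illustrative assumptions.

AUDIT_STEPS = [
    ("Plan", "define scope, criteria and schedule of the audit"),
    ("Do", "conduct the audit and collect evidence"),
    ("Check", "report and discuss the findings with the auditees"),
    ("Act", "implement corrective actions feeding into the next cycle"),
]

def run_audit_cycles(rounds: int = 1) -> list[str]:
    """Walk the PDCA steps in order; repeating the loop models the
    continuous-improvement aspect of the cycle."""
    log = []
    for _ in range(rounds):
        for phase, action in AUDIT_STEPS:
            log.append(f"{phase}: {action}")
    return log

print("\n".join(run_audit_cycles()))
```

The point of modeling the workflow as a cycle rather than a one-off checklist is that the "Act" output of one audit becomes the "Plan" input of the next.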
Table 4. Results of audits and assessments performed in the Department of Experimental Neurology.
| Assessments / Audits | Advantages | Disadvantages |
|---|---|---|
| risk analysis (FMEA) | establishes risk awareness; existing risks are identified and recorded; risks can be avoided or mitigated | complex and time-consuming; risk scores are determined arbitrarily; someone must track the measures taken to prevent risks |
| error reporting (LabCIRS) | establishes awareness for error reporting; allows learning from errors; systematic errors can be prevented; creates a transparent error culture; anonymous online tool accepted by researchers | needs a person responsible for communicating the reported errors and verifying actions taken to prevent recurrence |
| internal assessments | can be carried out ad hoc by QM competent personnel or trained scientists when needed; identify the usefulness and effectiveness of implemented measures | at least one responsible person within the organization needs to keep an overview of where and when internal audits are required |
| internal audits | prepare for certification and monitoring audits; provide an external view; auditors come from the same organization and are familiar with organization-specific processes and institutions | require a reciprocal audit in another facility of the institution / laboratory; not intended to check any specific procedure in detail; mainly designed to check the requirements of a certification, not scientific content; need a person trained as an internal auditor |
| peer audits | external, independent expert view; professional exchange at eye level; foster scientific collaboration; can positively influence the outcome of a project or process; raise awareness among researchers of specific quality issues; provide evidence of the effectiveness and transparency of the scientific process; accepted by researchers | time-consuming for auditors (who have to attend to their own research projects) and auditees; travel expenses if the auditors come from another city; not always easy to find suitable audit partners; require technological and methodological understanding of the research context from the auditor |
| certification audits | verify that all requirements of the norm and legal regulations are fulfilled | the contents of the research processes are checked only to a limited extent; work only as part of a system; high certification costs |
Fig 2. Graphical representation of the number of advantages and disadvantages of audits and assessments as perceived by members of the department.
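The kind of comparison Fig 2 visualizes can be recomputed by tallying the advantage and disadvantage items in the results table above. A minimal sketch, assuming one reading of the table's bullet items (labels shortened):

```python
# Tally of advantage vs. disadvantage items per audit type, counted
# from the results table above (item counts reflect one reading of
# the table; labels are shortened).

ITEM_COUNTS = {                        # (advantages, disadvantages)
    "risk analysis (FMEA)": (3, 3),
    "error reporting (LabCIRS)": (5, 1),
    "internal assessments": (2, 1),
    "internal audits": (3, 4),
    "peer audits": (7, 4),
    "certification audits": (1, 3),
}

def balance(counts: dict) -> dict:
    """Advantages minus disadvantages for each audit type."""
    return {name: adv - dis for name, (adv, dis) in counts.items()}

# Sort from the most favorable balance to the least favorable.
for name, diff in sorted(balance(ITEM_COUNTS).items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {diff:+d}")
```

On this count, error reporting and peer audits come out with the most favorable balance, while certification audits accumulate more perceived disadvantages than advantages.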