
Quality standards in proteomics research facilities: Common standards and quality procedures are essential for proteomics facilities and their users.

Cristina Chiva1,2, Teresa Mendes Maia3,4,5, Christian Panse6,7, Karel Stejskal8,9, Thibaut Douché10, Mariette Matondo10, Damarys Loew11, Dominic Helm12,13, Mandy Rettel12, Karl Mechtler8,9,14, Francis Impens3,4,5, Paolo Nanni6, Anna Shevchenko15, Eduard Sabidó1,2.   

Abstract

Proteomics research infrastructures and core facilities within the Core for Life alliance advocate for community policies for quality control to ensure high standards in proteomics services.
© 2021 The Authors. Published under the terms of the CC BY 4.0 license.


Year:  2021        PMID: 34009726      PMCID: PMC8183401          DOI: 10.15252/embr.202152626

Source DB:  PubMed          Journal:  EMBO Rep        ISSN: 1469-221X            Impact factor:   8.807


Core facilities and research infrastructures have become an essential part of the scientific ecosystem. In the field of proteomics, national and international networks and research platforms have been established during the past decade to set standards for high-quality services, promote the exchange of professional information, and enable access to cutting-edge, specialized proteomics technologies. Whether centralized or distributed, these national and international proteomics infrastructures and technology platforms generate massive amounts of data for the research community and support a broad range of translational, computational and multi-omics initiatives as well as basic research projects. By delegating part of their work to these services, researchers expect the core facility to adjust its analytical protocols appropriately for their project and to acquire data conforming to the best research practices of the scientific community. The implementation of quality assessment measures and commonly accepted quality controls in data generation is therefore crucially important for proteomics research infrastructures and the scientists who rely on them.

However, current quality control and quality assessment procedures in proteomics core facilities and research infrastructures are a motley collection of protocols, standards, reference compounds and software tools. Proteomics relies on a customized multi-step workflow typically consisting of sample preparation, data acquisition and data processing, and the implementation of each step differs among facilities. For example, sample preparation involves enzymatic digestion of the proteins, which can be performed in-solution, in-gel, or on-beads, often with different proteolytic enzymes, chemicals, and conditions among laboratories. Data acquisition protocols are often customized to the particular instrument setup, and the acquired spectra and chromatograms are processed by different software tools provided by equipment vendors or third parties, or developed in-house. Moreover, core facilities implement their own guidelines to monitor the performance and quality of the entire workflow, typically using different commercially available standards such as pre-digested cell lysates, recombinant proteins, protein mixtures, or isotopically labeled peptides. Currently, there is no clear consensus on if, when and how to perform quality control checks. There is even less quality control in walk-in facilities, where the staff is only responsible for the correct usage of the instruments and users select and execute the analytical workflow themselves. It is therefore not surprising that the stability of instruments and the robustness of the applied analytical approaches are often unclear, which compromises analytical rigor.

Establishing standardized practices

Initiated by the HUPO Proteomics Standards Initiative (PSI) more than a decade ago, the MIAPE guidelines (Minimum Information About a Proteomics Experiment; Taylor et al, 2007) introduced common formats for sharing and reporting proteomics data, including unrestricted access to raw data in public repositories (Vizcaíno et al, 2014). Supported by journal guidelines that request the deposition of raw data into such repositories as a condition for publication, these repositories have grown into a rich resource for data mining and multi-omics integration. However, the MIAPE guidelines did not include quality metrics, and there is still no generic tool capable of independently ascertaining the technical quality of the deposited data. The importance of quality assessments for open-access proteomics was highlighted in the Amsterdam Principles more than 10 years ago (Rodriguez et al, 2009), but the development of quality threshold metrics was delegated to the central repositories. A few years later, the Sydney workshop convened by the US National Cancer Institute made recommendations and formulated key principles for data quality metrics, whose implementation journal editors and reviewers were expected to encourage or enforce in practice (Kinsinger et al, 2012). Its corollary recognized “the need for formal comparison of methods on equal footing”, thus alluding for the first time to a common quality control. More recently, recommendations for quality control metrics have indeed been included in publishing guidelines (Abbatiello et al, 2017). The need for common quality assessment protocols in scientific infrastructures has also been emphasized by international research associations: the Association of Biomolecular Resource Facilities (ABRF), Core Technologies for Life Sciences (CTLS), Core for Life (C4L), and the Clinical Proteomic Tumor Analysis Consortium (CPTAC) have initiated discussions on and the development of common quality procedures, and continuously promote the sharing of best practices.
A recent comprehensive survey among research facilities across Europe showed that the majority of core facilities do recognize the need for and importance of quality controls (Kos-Braun et al, 2020). However, we believe that the issue of systematic quality procedures in proteomics infrastructures has still not received the public attention it deserves. Moreover, we maintain that community efforts toward quality control and quality assessment are not sufficiently organized to achieve systematic agreement, despite the availability of methods for the evaluation of analytical protocols, of intra- and inter-laboratory comparisons of reproducibility, and of software tools for the automated monitoring of instrument performance.

Common quality control procedures

Common quality control procedures in proteomics core facilities ensure technical quality, reproducibility, comparability, and data integrity. A representative example of how these benefit the coordinated work of several proteomics units is the dissection and validation of a SARS-CoV-2 protein interaction map (Gordon et al, 2020). Common quality controls foster the reuse of resources, protect against bias in experimental design, and improve daily routines (Fig 1). Systematic assessment of instrument performance, early recognition of poor-quality data, and monitoring of carry-over and background signals enable long-term robustness and reproducibility of the proteomics workflow and mitigate the impact of aging instruments or the turnover of laboratory staff.
Figure 1

Benefits of common quality procedures

Main benefits of the implementation of common quality procedures in proteomics research infrastructures and core facilities for the users and customers, the scientific community, and the infrastructures themselves.

Quality control procedures should be generic and flexible and support diverse workflows and instrumental platforms. Core facilities and research infrastructures are technology hubs, and their operations are not, and should not be, limited to routine analytical measurements. The diversity of model organisms, scales, and research goals of the scientific community generates numerous project-specific protocols and a great variability of workflows. Instrumentation platforms and analytical software will also remain diverse and heterogeneous in the foreseeable future: the choice of mass spectrometry equipment is often not only defined by scientific requirements but also influenced by the availability of funding, and it results in a collection of instruments of different generations, types, and vendors within the same facility. Quality procedures should therefore be organized into a framework that accommodates these diverse workflows and instrumental platforms. Such a framework should rely on common, commercially available protein and peptide standards that, alone or spiked into the samples, are systematically analyzed for values relevant for quality control, such as the number of identified proteins, the retention time and intensity of peptide chromatographic peaks, or the ratio or fold change of endogenous and isotopically labeled reference peptides. These repeated test runs would document the analytical performance of the entire workflow applied to an individual sample or sample batch.
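The per-run quality values described above can be checked automatically against laboratory acceptance limits. The sketch below illustrates the idea; all field names, reference peptides, and thresholds are hypothetical examples, not values prescribed by any community standard:

```python
from dataclasses import dataclass

# Hypothetical per-run QC summary for a standard sample; the fields mirror
# the values named in the text (protein count, retention times, intensities).
@dataclass
class QCRun:
    run_id: str
    n_proteins: int               # proteins identified in the standard
    rt_minutes: dict[str, float]  # retention time per reference peptide
    intensity: dict[str, float]   # peak intensity per reference peptide

def check_run(run: QCRun, ref: QCRun,
              min_protein_frac: float = 0.9,
              max_rt_shift_min: float = 0.5,
              max_intensity_fold: float = 2.0) -> list[str]:
    """Compare a QC run against a laboratory reference run; return a
    list of human-readable failure reasons (empty list = run passes)."""
    failures = []
    if run.n_proteins < min_protein_frac * ref.n_proteins:
        failures.append(f"identified proteins dropped to {run.n_proteins} "
                        f"(reference {ref.n_proteins})")
    for pep, rt in run.rt_minutes.items():
        if abs(rt - ref.rt_minutes[pep]) > max_rt_shift_min:
            failures.append(f"retention time shift for {pep}: "
                            f"{rt - ref.rt_minutes[pep]:+.2f} min")
    for pep, inten in run.intensity.items():
        fold = inten / ref.intensity[pep]
        if fold > max_intensity_fold or fold < 1 / max_intensity_fold:
            failures.append(f"intensity fold change for {pep}: {fold:.2f}")
    return failures

ref = QCRun("ref", 4000, {"PEP1": 20.0, "PEP2": 35.0},
            {"PEP1": 1e6, "PEP2": 5e5})
bad = QCRun("run42", 3200, {"PEP1": 21.0, "PEP2": 35.1},
            {"PEP1": 9e5, "PEP2": 5e5})
print(check_run(bad, ref))  # reports the protein drop and the PEP1 RT shift
```

Returning reasons rather than a bare pass/fail keeps the check useful for troubleshooting, since the same record can be logged alongside the run.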
This information would be highly valuable for detecting random failures, monitoring instrument stability, and ensuring the reproducibility of repeated analyses, but also for continuous method optimization and the development of new methods. Moreover, common quality control parameters should be submitted to repositories along with the raw result data, as required by the MIAPE guidelines. The frequent analysis of standard samples would also help to generate laboratory-based average references that monitor the performance drift of instruments and indicate whether the settings applied to the analytical and computational workflows are optimal or need readjustment. Such records could help to diagnose instrument malfunctions and to test the performance of new instruments. In the future, software tools could be integrated with both instruments and reference data repositories to streamline the collection and management of quality control values. The aforementioned procedures could be a step towards ISO (International Organization for Standardization) or other quality certifications for core facilities and research infrastructures that require them.
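The laboratory-based average references described above amount to a simple control chart over repeated standard runs. A minimal sketch, assuming a rolling window and a two-sigma rule (both are illustrative choices, not a prescribed standard):

```python
import statistics

def flag_drift(history: list[float], new_value: float,
               window: int = 20, n_sigma: float = 2.0) -> bool:
    """Return True if new_value falls outside mean +/- n_sigma * stdev
    of the most recent `window` standard-sample runs (a Levey-Jennings
    style rule applied to any single QC value)."""
    recent = history[-window:]
    if len(recent) < 3:  # too few reference runs to estimate the spread
        return False
    mu = statistics.mean(recent)
    sigma = statistics.stdev(recent)
    return abs(new_value - mu) > n_sigma * sigma

# Stable baseline of identified-protein counts, then a sudden drop.
baseline = [4000, 4050, 3980, 4020, 3990, 4010]
print(flag_drift(baseline, 4005))  # within limits
print(flag_drift(baseline, 3600))  # flagged as drift
```

The same rule works for any of the QC values mentioned in the text (retention time, peak intensity, protein counts), one history per metric and instrument.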

The role of funders and technology providers

The development and implementation of common quality management schemes pose a challenge for the entire proteomics community and require support from technology providers and funding agencies. The proteomics community must therefore define a common set of quality parameters, standards, controlled vocabulary, and generic file formats to support collective testing and the anonymized evaluation of results. It should also work with technology providers to implement quality checks in vendors' software. Last but not least, the community should approach national and international funding bodies to raise awareness of the importance of common quality control procedures in order to secure their financial support. Research infrastructures and core facilities are in a position to drive initiatives that require extensive collaboration and concerted efforts. Within the Core for Life alliance (Meder et al, 2016; Lippens et al, 2019) (https://coreforlife.eu), our proteomics laboratories advocate for community policies on quality control procedures to ensure high standards in proteomics services. Among other initiatives, we have developed and endorsed the QCloud tool as a cross-platform, open-source quality control software for the systematic monitoring of instrument performance (Chiva et al, 2018; Olivella et al, 2021). However, there is a further need to develop automated, user-friendly, and flexible routines suitable for the inter-laboratory collection of quality data that satisfy data protection requirements and remain affordable for the broader research community. These and other practical measures require the understanding and support of the entire proteomics community as well as of funding and publishing bodies.
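A vendor-neutral exchange format for such quality parameters could be as simple as a JSON record whose metric identifiers come from a shared controlled vocabulary. The sketch below only illustrates the idea; the metric IDs, names, and record layout are invented for this example, and an actual community format would have to be agreed upon (e.g., building on HUPO-PSI work):

```python
import json

# Hypothetical controlled vocabulary of QC metric identifiers; a real
# community standard would define and maintain these terms centrally.
ALLOWED_METRICS = {
    "QC:0001": "number of identified proteins",
    "QC:0002": "median peptide retention time shift (min)",
    "QC:0003": "reference peptide intensity fold change",
}

def make_qc_record(instrument: str, run_id: str,
                   metrics: dict[str, float]) -> str:
    """Serialize a per-run QC report, rejecting metric IDs that are
    not part of the shared vocabulary."""
    unknown = set(metrics) - set(ALLOWED_METRICS)
    if unknown:
        raise ValueError(f"unknown metric IDs: {sorted(unknown)}")
    return json.dumps({"instrument": instrument,
                       "run_id": run_id,
                       "metrics": metrics}, sort_keys=True)

record = make_qc_record("orbitrap-01", "2021-05-19_std42",
                        {"QC:0001": 4012, "QC:0002": 0.12})
print(record)
```

Validating against the vocabulary at serialization time is what makes records from different laboratories comparable and safely aggregatable by repositories.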
References (12 in total)

1.  The minimum information about a proteomics experiment (MIAPE).

Authors:  Chris F Taylor; Norman W Paton; Kathryn S Lilley; Pierre-Alain Binz; Randall K Julian; Andrew R Jones; Weimin Zhu; Rolf Apweiler; Ruedi Aebersold; Eric W Deutsch; Michael J Dunn; Albert J R Heck; Alexander Leitner; Marcus Macht; Matthias Mann; Lennart Martens; Thomas A Neubert; Scott D Patterson; Peipei Ping; Sean L Seymour; Puneet Souda; Akira Tsugita; Joel Vandekerckhove; Thomas M Vondriska; Julian P Whitelegge; Marc R Wilkins; Ioannnis Xenarios; John R Yates; Henning Hermjakob
Journal:  Nat Biotechnol       Date:  2007-08       Impact factor: 54.908

2.  One step ahead: Innovation in core facilities.

Authors:  Saskia Lippens; Christophe D'Enfert; Lilla Farkas; Anna Kehres; Bernhard Korn; Mònica Morales; Rainer Pepperkok; Lavanya Premvardhan; Ralph Schlapbach; Andreas Tiran; Doris Meder; Geert Van Minnebruggen
Journal:  EMBO Rep       Date:  2019-03-14       Impact factor: 8.807

3.  QCloud2: An Improved Cloud-based Quality-Control System for Mass-Spectrometry-based Proteomics Laboratories.

Authors:  Roger Olivella; Cristina Chiva; Marc Serret; Daniel Mancera; Luca Cozzuto; Antoni Hermoso; Eva Borràs; Guadalupe Espadas; Julia Morales; Olga Pastor; Amanda Solé; Julia Ponomarenko; Eduard Sabidó
Journal:  J Proteome Res       Date:  2021-03-16       Impact factor: 4.466

4.  New Guidelines for Publication of Manuscripts Describing Development and Application of Targeted Mass Spectrometry Measurements of Peptides and Proteins.

Authors:  Susan Abbatiello; Bradley L Ackermann; Christoph Borchers; Ralph A Bradshaw; Steven A Carr; Robert Chalkley; Meena Choi; Eric Deutsch; Bruno Domon; Andrew N Hoofnagle; Hasmik Keshishian; Eric Kuhn; Daniel C Liebler; Michael MacCoss; Brendan MacLean; D R Mani; Hendrik Neubert; Derek Smith; Olga Vitek; Lisa Zimmerman
Journal:  Mol Cell Proteomics       Date:  2017-02-09       Impact factor: 5.911

5.  Recommendations for mass spectrometry data quality metrics for open access data (corollary to the Amsterdam Principles).

Authors:  Christopher R Kinsinger; James Apffel; Mark Baker; Xiaopeng Bian; Christoph H Borchers; Ralph Bradshaw; Mi-Youn Brusniak; Daniel W Chan; Eric W Deutsch; Bruno Domon; Jeff Gorman; Rudolf Grimm; William Hancock; Henning Hermjakob; David Horn; Christie Hunter; Patrik Kolar; Hans-Joachim Kraus; Hanno Langen; Rune Linding; Robert L Moritz; Gilbert S Omenn; Ron Orlando; Akhilesh Pandey; Peipei Ping; Amir Rahbar; Robert Rivers; Sean L Seymour; Richard J Simpson; Douglas Slotta; Richard D Smith; Stephen E Stein; David L Tabb; Danilo Tagle; John R Yates; Henry Rodriguez
Journal:  J Proteome Res       Date:  2011-12-08       Impact factor: 4.466

6.  Recommendations from the 2008 International Summit on Proteomics Data Release and Sharing Policy: the Amsterdam principles.

Authors:  Henry Rodriguez; Mike Snyder; Mathias Uhlén; Phil Andrews; Ronald Beavis; Christoph Borchers; Robert J Chalkley; Sang Yun Cho; Katie Cottingham; Michael Dunn; Tomasz Dylag; Ron Edgar; Peter Hare; Albert J R Heck; Roland F Hirsch; Karen Kennedy; Patrik Kolar; Hans-Joachim Kraus; Parag Mallick; Alexey Nesvizhskii; Peipei Ping; Fredrik Pontén; Liming Yang; John R Yates; Stephen E Stein; Henning Hermjakob; Christopher R Kinsinger; Rolf Apweiler
Journal:  J Proteome Res       Date:  2009-07       Impact factor: 4.466

7.  Institutional core facilities: prerequisite for breakthroughs in the life sciences: Core facilities play an increasingly important role in biomedical research by providing scientists access to sophisticated technology and expertise.

Authors:  Doris Meder; Mònica Morales; Rainer Pepperkok; Ralph Schlapbach; Andreas Tiran; Geert Van Minnebruggen
Journal:  EMBO Rep       Date:  2016-07-13       Impact factor: 8.807

8.  A survey of research quality in core facilities.

Authors:  Isabelle C Kos-Braun; Björn Gerlach; Claudia Pitzer
Journal:  Elife       Date:  2020-11-26       Impact factor: 8.140

9.  ProteomeXchange provides globally coordinated proteomics data submission and dissemination.

Authors:  Juan A Vizcaíno; Eric W Deutsch; Rui Wang; Attila Csordas; Florian Reisinger; Daniel Ríos; José A Dianes; Zhi Sun; Terry Farrah; Nuno Bandeira; Pierre-Alain Binz; Ioannis Xenarios; Martin Eisenacher; Gerhard Mayer; Laurent Gatto; Alex Campos; Robert J Chalkley; Hans-Joachim Kraus; Juan Pablo Albar; Salvador Martinez-Bartolomé; Rolf Apweiler; Gilbert S Omenn; Lennart Martens; Andrew R Jones; Henning Hermjakob
Journal:  Nat Biotechnol       Date:  2014-03       Impact factor: 54.908

10.  A SARS-CoV-2 protein interaction map reveals targets for drug repurposing.

Authors:  David E Gordon; Gwendolyn M Jang; Mehdi Bouhaddou; Jiewei Xu; Kirsten Obernier; Kris M White; Matthew J O'Meara; Veronica V Rezelj; Jeffrey Z Guo; Danielle L Swaney; Tia A Tummino; Ruth Hüttenhain; Robyn M Kaake; Alicia L Richards; Beril Tutuncuoglu; Helene Foussard; Jyoti Batra; Kelsey Haas; Maya Modak; Minkyu Kim; Paige Haas; Benjamin J Polacco; Hannes Braberg; Jacqueline M Fabius; Manon Eckhardt; Margaret Soucheray; Melanie J Bennett; Merve Cakir; Michael J McGregor; Qiongyu Li; Bjoern Meyer; Ferdinand Roesch; Thomas Vallet; Alice Mac Kain; Lisa Miorin; Elena Moreno; Zun Zar Chi Naing; Yuan Zhou; Shiming Peng; Ying Shi; Ziyang Zhang; Wenqi Shen; Ilsa T Kirby; James E Melnyk; John S Chorba; Kevin Lou; Shizhong A Dai; Inigo Barrio-Hernandez; Danish Memon; Claudia Hernandez-Armenta; Jiankun Lyu; Christopher J P Mathy; Tina Perica; Kala Bharath Pilla; Sai J Ganesan; Daniel J Saltzberg; Ramachandran Rakesh; Xi Liu; Sara B Rosenthal; Lorenzo Calviello; Srivats Venkataramanan; Jose Liboy-Lugo; Yizhu Lin; Xi-Ping Huang; YongFeng Liu; Stephanie A Wankowicz; Markus Bohn; Maliheh Safari; Fatima S Ugur; Cassandra Koh; Nastaran Sadat Savar; Quang Dinh Tran; Djoshkun Shengjuler; Sabrina J Fletcher; Michael C O'Neal; Yiming Cai; Jason C J Chang; David J Broadhurst; Saker Klippsten; Phillip P Sharp; Nicole A Wenzell; Duygu Kuzuoglu-Ozturk; Hao-Yuan Wang; Raphael Trenker; Janet M Young; Devin A Cavero; Joseph Hiatt; Theodore L Roth; Ujjwal Rathore; Advait Subramanian; Julia Noack; Mathieu Hubert; Robert M Stroud; Alan D Frankel; Oren S Rosenberg; Kliment A Verba; David A Agard; Melanie Ott; Michael Emerman; Natalia Jura; Mark von Zastrow; Eric Verdin; Alan Ashworth; Olivier Schwartz; Christophe d'Enfert; Shaeri Mukherjee; Matt Jacobson; Harmit S Malik; Danica G Fujimori; Trey Ideker; Charles S Craik; Stephen N Floor; James S Fraser; John D Gross; Andrej Sali; Bryan L Roth; Davide Ruggero; Jack Taunton; Tanja Kortemme; Pedro Beltrao; Marco Vignuzzi; Adolfo García-Sastre; Kevan M Shokat; Brian K Shoichet; Nevan J Krogan
Journal:  Nature       Date:  2020-04-30       Impact factor: 69.504

