
Development of a checklist of standard items for processing individual participant data from randomised trials for meta-analyses: Protocol for a modified e-Delphi study.

Kylie E Hunter1, Angela C Webster1, Mike Clarke2, Matthew J Page3, Sol Libesman1, Peter J Godolphin4, Mason Aberoumand1, Larysa H M Rydzewska4, Rui Wang5, Aidan C Tan1, Wentao Li5, Ben W Mol5, Melina Willson1, Vicki Brown6, Talia Palacios1, Anna Lene Seidler1.   

Abstract

Individual participant data meta-analyses enable detailed checking of data quality and more complex analyses than standard study-level synthesis of summary data based on publications. However, there is limited existing guidance on the specific systematic checks that should be undertaken to confirm and enhance data quality for individual participant data meta-analyses and how to conduct these checks. We aim to address this gap by developing a checklist of items for data quality checking and cleaning to be applied to individual participant data meta-analyses of randomised trials. This study will comprise three phases: 1) a scoping review to identify potential checklist items; 2) two e-Delphi survey rounds among an invited panel of experts followed by a consensus meeting; and 3) pilot testing and refinement of the checklist, including development of an accompanying R-markdown program to facilitate its uptake.

Year:  2022        PMID: 36219622      PMCID: PMC9553056          DOI: 10.1371/journal.pone.0275893

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.752


Introduction

Background and rationale

Individual participant data meta-analyses (IPD-MA) involve the central collection of data for each individual participant in each study included in a systematic review. They offer many advantages over aggregate data meta-analyses, and can provide key evidence to inform guidelines, policy and practice [1]. However, the validity of any meta-analysis depends on the quality of the included studies and their data [2, 3]. One advantage of IPD-MA is that access to raw data allows more detailed data quality checks than are possible when data are synthesised at study level, in aggregate. Addressing any issues identified by these checks enables higher quality data to be included in the meta-analysis, leading to more robust and reliable results. Hence, data quality checks are considered an integral component of the conduct of IPD-MA [4]. Despite this, a recent systematic review evaluating the methodological conduct of all IPD-MA of randomised trials published in English up to September 2019 (n = 323) reported that only 56% conducted checks for invalid, inconsistent, out-of-range or missing data [5]. This may be due to inadequate resourcing and under-appreciation of the time-intensive nature of such checks [6], which can require at least three to four days per trial and up to several weeks for large and complex datasets [7].

Furthermore, while there is an abundance of literature providing advice on the different types of data quality checks that could or should be conducted, it is not easily actionable: guidance on which checks should be systematically performed, and how they should be conducted, is lacking. For instance, in the IPD-MA Handbook for Healthcare Research [4], the authors advise that the validity, range and consistency of variables should be checked, and recommend using a standardised approach across data checkers, such as following a common checklist and/or statistical code developed for this purpose.
However, the specific items on this checklist and how to action them are not elucidated. The PRIME-IPD tool [6, 8, 9] provides a useful framework for preparing IPD for meta-analysis encompassing five steps (Processing, Replication, Imputation, Merging, and Evaluation), but is not prescriptive about how each step should be performed [6]. Authors of PRIME-IPD have acknowledged that the framework needs to be refined and suggest a consensus approach such as a Delphi survey method would be appropriate for this purpose [6, 9].

Objectives

We aim to address this gap by developing an internationally recognised checklist to guide routine data quality checking and cleaning for IPD-MA of randomised trials: the CHecklist for Individual Participant data Processing of Randomised Trials (CHIPPR). While the focus is on randomised trials, many of the checks will also apply to data from other types of studies being brought together for secondary use, such as observational studies. Fig 1 depicts how CHIPPR fits within the overall conduct of an IPD-MA, and distinguishes between three separate but related types of checks that should be undertaken. The first type focuses on integrity and the identification of potential research misconduct (fabrication, falsification or plagiarism) [10]; this is not within the scope of this study, and a checklist for it (informed by a scoping review) [11] is already in development. The second type is risk of bias checks, which can be informed by direct checking of IPD, e.g. to assess whether randomisation is robust. The third type, data quality checks, forms the focus of this paper.
Fig 1

Stages of an individual participant data meta-analysis, highlighting where CHIPPR fits in and interrelated types of data checks (adapted from Rydzewska & Tierney) [4].

We define data quality as the extent to which dimensions of data meet requirements, where a dimension is a measurable feature of a data concept such as accuracy, completeness, consistency and validity (Table 1) [12, 13]. The three types of data checks overlap in Fig 1 to demonstrate that some checks are common, e.g. identification of implausible dates is a common component across data quality and integrity checks; and checking for imbalance in baseline characteristics is common for risk of bias assessments, integrity checks and data quality. Each checklist item will include methods of assessment and decision criteria to guide management of any issues identified. To facilitate uptake of CHIPPR, we will develop an R-markdown program [14-16] to automate some of the processes described in the checklist. This may be adapted to other statistical software (e.g. STATA, SAS) in the future. The ultimate objective is to improve the quality and reliability of IPD-MA.
Table 1

Dimensions of data quality.

Data dimension | Definition | Examples of non-compliance

Completeness: The degree to which i) all required variables, ii) all required records, and iii) all required data values are present in the dataset.
• Birthweight variable is missing from the dataset, but is reported in the publication
• Not all randomised participants are present in the dataset
• Birthweight is not provided for every record/participant

Reasonability: The degree to which a data pattern meets expectations. Includes time series trends, expectations of randomness and digit preference.
• Allocation patterns are not random
• Despite each digit having an equal chance of taking any value, a digit is never 5, or is mostly 8
• Participant height at time 2 is less than height at time 1

Compliance: The degree to which values and variables are in accordance with the IPD study codebook, either in their original form or after transformation. Includes variable definitions, format specifications, categories and outcome scales.
• Nutritional intake should be in kilojoules but is provided in calories
• Post-partum haemorrhage defined as blood loss >600 ml rather than >500 ml
• Date recorded as mm/dd/yy instead of dd/mm/yyyy
• Sex should be coded male = 1, female = 2, but is female = 1, male = 2

Consistency: The degree to which i) data values match corresponding publications or reports (external consistency), or ii) data values of two or more variables within a participant are logical or comply with a rule or equation (internal consistency). Do not confuse consistency with accuracy or correctness.
• Publication reports 52/168 children had obesity; data show 49/170
• Protocol states eligible age is ≥18 years, but a participant is 15 years old
• Participant body mass index does not equal their weight/height²
• Participant age at baseline was 50, but age at follow-up was 45
• Hours slept at night plus hours slept during the day is >24 hours
• Date of follow-up assessment occurs before date of enrolment

Validity: The degree to which dates or data values are within a pre-specified or valid range. A data value can be valid but not accurate.
• An enrolment date that occurs outside of the study start and end dates
• Score on a scale of 1–10 is 12
• Gestational age at birth is 54 weeks (outlier), but the valid range is 20–45 weeks

Plausibility: The degree to which data values match knowledge of the real world (this can be seen as a type of consistency). Values may be possible but not plausible.
• 90% of participants died despite scoring well on measures of health
• Participant self-reported intense physical activity of 20 hrs/day
• 80% of routine visits occurred on a public holiday

Uniqueness: The degree to which records occur only once in a dataset.
• Duplicate participant identifiers
• Identical values for all variables except participant identifier for ≥2 records

Adapted from Black et al. [12, 13].

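Several of these dimensions lend themselves to automated checking. Below is a minimal illustrative sketch, not part of the protocol (which proposes an R-markdown program; Python and the toy variable names `id`, `age`, `height_m`, `weight_kg` and `bmi` are assumptions made here for illustration), showing how validity, uniqueness and internal consistency checks from Table 1 might be expressed:

```python
# Illustrative sketch only: automated checks for a few of the data quality
# dimensions in Table 1, applied to a toy IPD extract. All variable names
# and data values are hypothetical.

def check_validity(records, var, lo, hi):
    """Validity: flag records whose value falls outside a pre-specified range."""
    return [r["id"] for r in records
            if r.get(var) is not None and not (lo <= r[var] <= hi)]

def check_uniqueness(records):
    """Uniqueness: flag participant identifiers that occur more than once."""
    seen, dupes = set(), set()
    for r in records:
        (dupes if r["id"] in seen else seen).add(r["id"])
    return sorted(dupes)

def check_internal_consistency(records, tol=0.5):
    """Internal consistency: recorded BMI should equal weight / height**2."""
    return [r["id"] for r in records
            if abs(r["bmi"] - r["weight_kg"] / r["height_m"] ** 2) > tol]

ipd = [
    {"id": 1, "age": 34, "height_m": 1.70, "weight_kg": 65.0, "bmi": 22.5},
    {"id": 2, "age": 15, "height_m": 1.60, "weight_kg": 80.0, "bmi": 31.2},  # under-age
    {"id": 2, "age": 41, "height_m": 1.80, "weight_kg": 90.0, "bmi": 40.0},  # duplicate id, BMI inconsistent
]

print(check_validity(ipd, "age", 18, 99))   # -> [2]  (participant outside eligible age)
print(check_uniqueness(ipd))                # -> [2]  (duplicated identifier)
print(check_internal_consistency(ipd))      # -> [2]  (BMI disagrees with weight/height^2)
```

In a real IPD-MA pipeline, checks like these would be parameterised from the study codebook (valid ranges, coding schemes) and run systematically over every trial dataset.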

Materials and methods

As shown in Fig 2, this study will comprise three phases: 1) a scoping review to identify potential checklist items; 2) two e-Delphi survey rounds among an invited panel of experts followed by a consensus meeting; and 3) pilot testing and refinement of the checklist, including development of an accompanying R-markdown [14-16] program to facilitate its uptake. The project will be coordinated and overseen by an international steering group of experts in IPD methodology, evidence synthesis, statistics, clinical trial design and data management.
Fig 2

Overview of project phases.

Phase 1: Scoping review to identify potential checklist items

We will undertake a scoping review in accordance with Joanna Briggs Institute (JBI) guidelines [17], to determine which data quality checks researchers perform when processing datasets for inclusion in an IPD-MA of randomised trials. A full protocol for this review, covering each relevant item of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) [18], has been preregistered on the Open Science Framework (OSF, https://osf.io/g2unf/). We will include systematic reviews with IPD-MA of randomised trials on intervention effects published in English. These have previously been identified up to September 2019 in a systematic review by Wang et al. [5], and we will update their search to include all others published up to July 2022. Records will be independently screened by two reviewers, with any discrepancies resolved by discussion or, if required, adjudication by a third reviewer. For each eligible record, we will obtain relevant supplementary materials that are attached or referred to in the publication, e.g. statistical analysis plan, data management plan, PROSPERO registration record.

We will extract information on characteristics of the included IPD-MA (e.g. year of publication, country, number of included studies) and descriptions of any procedures used to check, clean, transform or prepare individual participant data for analyses, including any software used. We will use a standardised data extraction form, which will be piloted by five independent reviewers and revised accordingly prior to commencing full extraction. Where available, the form will be pre-populated with data extracted by Wang et al. [5] to avoid duplication. Data will be extracted by one reviewer, with a random subset of 10% checked by a second reviewer. Disagreements will be resolved through discussion, with a third reviewer involved if consensus is not reached. Extracted data will be presented in summary tables and figures.
Next, we will conduct a brief online survey of authors of these IPD-MA to obtain more detailed information about their data quality checking and cleaning processes. The survey will include the data already extracted for their IPD-MA (as above), and will ask authors to add any other data quality checks that are missing, with an emphasis on how these checks were performed. Reminder emails will be sent after two weeks to non-responders. All identified candidate items will be collated and grouped into common themes and domains (such as randomisation, inconsistencies and missing data) using inductive thematic analysis [19]. Lastly, the study team will draw on their expertise and experience as IPD-MA researchers, data managers, statisticians, methodologists and clinicians to add new items, exclude items beyond the scope of the research question, and refine the wording and descriptions of each candidate item for inclusion in the first e-Delphi survey round. A patient representative with previous research experience will also be separately consulted to provide feedback on the list of items. Results of the scoping review will be reported in accordance with the PRISMA-ScR [20] and disseminated via peer-reviewed publication.

Phase 2: e-Delphi survey and consensus meeting

We will conduct a two-round modified electronic Delphi (e-Delphi) survey [21] among researchers with expertise in data processing, to achieve consensus on the most important items for checking and cleaning an IPD dataset. The e-Delphi surveys will be managed using Qualtrics software [22] and pilot tested by steering group members before being launched to the expert panel (see below). Participation involves completion of survey round 1 and/or 2, and consent is implied by completion. Responses will be quasi-anonymous [23], meaning they will be anonymous to all but the lead researcher, who will be able to link responses to individuals via a unique identifier. This enables provision of feedback on previous individual responses in subsequent rounds, which is a core feature of the Delphi method [21]. Each survey round will be open for 4 weeks, and reminder emails will be sent after weeks 1 and 3 to enhance response rates. There will be a break of approximately 2–3 weeks between rounds.

Participants for expert panel

We will invite a representative sample of relevant international stakeholders (e.g. systematic reviewers, statisticians, clinical trialists, data managers and journal editors) with experience in data processing for IPD-MA to participate in the e-Delphi survey rounds. Fluency in English is required, since this is the language in which the surveys will be conducted. Participants will be recruited using purposive and snowball sampling [21]: specifically, by contacting corresponding authors of all IPD-MA identified in the scoping review (phase 1), via key contacts of steering group members, and via relevant professional networks and organisations, including the Cochrane Individual Participant Data Meta-analysis (https://methods.cochrane.org/ipdma/) and Prospective Meta-analysis (https://methods.cochrane.org/pma/) Methods Groups. There is no consensus on the ideal sample size for Delphi studies; the decision should be based on the complexity of the topic, the degree and diversity of expertise required, and the available resources [21]. To enhance generalisability of results and account for attrition, we aim to recruit a broad range of panellists across diverse geographical locations and from a variety of therapeutic areas. Invitations will be sent to publicly available email addresses and will include a Participant Information Sheet and a link to access round 1 of the e-Delphi survey.

e-Delphi survey, round 1

Panel members will each be issued with an anonymous identifier and asked to rate each candidate item by importance on a 7-point Likert scale (1 = ‘not at all important’; 2 = ‘low importance’; 3 = ‘slightly important’; 4 = ‘neutral’; 5 = ‘moderately important’; 6 = ‘very important’; 7 = ‘extremely important’) [24]. This is based on the recommendation that four to seven response categories are optimal for Delphi studies in health research [25]. Each item will be accompanied by a brief description of its conceptual logic and the rationale for its inclusion. If panellists feel they lack the requisite expertise or understanding to appropriately rate an item, they will be instructed to select the option ‘Unable to rate’ (rather than 4 = ‘neutral’, which would dilute results). These responses will be classed as missing (NA) and excluded from the analyses. Panellists will also have the opportunity to suggest modifications to the items, propose additional items, and elaborate on their responses using free text. Basic demographic information (e.g. area of expertise, qualifications, experience, country of residence, gender) will also be collected to provide an overall profile of panellists and to give an indication of their representativeness. On survey close, we will conduct descriptive analyses of participant characteristics and item rating scores (frequencies, proportions, medians, interquartile ranges) to determine whether consensus has been reached. While there are no recognised guidelines on what constitutes an appropriate level of consensus, we will use the following a priori definitions, based on a review of the literature and consultation among steering group members [21, 25–27]:
• Consensus to include an item: ≥75% of the panel rate the item as ‘very important’ or ‘extremely important’.
• Consensus to exclude an item: ≥75% of the panel rate the item as ‘not at all important’ or ‘low importance’.
• No consensus: all other combinations.
Qualitative data in the form of free text responses will be reviewed and narratively summarised. Any new suggested items or modifications will be discussed among the steering group to determine whether they are unique and relevant for inclusion in round 2.
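The a priori consensus rule can be expressed as a small function. The sketch below is illustrative only (it is not the study's analysis code, which has yet to be written); ‘Unable to rate’ responses are represented here as None and excluded before percentages are calculated, as described above:

```python
# Illustrative sketch of the a priori consensus rule (75% threshold on a
# 7-point Likert scale). 'Unable to rate' responses are encoded as None.

def classify_consensus(ratings, threshold=0.75):
    """Return 'include', 'exclude' or 'no consensus' for one item's ratings."""
    rated = [r for r in ratings if r is not None]  # drop 'Unable to rate' (NA)
    if not rated:
        return "no consensus"
    high = sum(1 for r in rated if r >= 6) / len(rated)  # 'very'/'extremely important'
    low = sum(1 for r in rated if r <= 2) / len(rated)   # 'not at all'/'low importance'
    if high >= threshold:
        return "include"
    if low >= threshold:
        return "exclude"
    return "no consensus"

print(classify_consensus([7, 6, 6, 7, None, 5]))  # 4/5 rated >= 6 -> 'include'
print(classify_consensus([1, 2, 4, 2, 1]))        # 4/5 rated <= 2 -> 'exclude'
print(classify_consensus([4, 5, 6, 3]))           # neither threshold met -> 'no consensus'
```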

e-Delphi survey, round 2

Panellists will be invited to participate in the e-Delphi round 2 regardless of whether they completed round 1, to reduce the chance of false consensus, to mitigate attrition, and so that opinions of the original panel are more accurately represented [28]. Since new panellists may join at round 2, all previous items from round 1 will be included in the survey plus any new suggested items. For each item, panellists will be provided with their individual response (where applicable) and the median and interquartile range score for the whole panel from round 1, and given the opportunity to revise and re-rate these items in round 2 according to the same Likert scale [21]. Results will be analysed to assess consensus using the same criteria as for round 1. Attrition bias arising from participant withdrawal between rounds will be explored by comparing response distributions of withdrawn to completing participants [29]. We will also conduct an analysis of whether the chosen items would change if we limited the data to respondents to round 1 only, round 2 only or both round 1 and 2.
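The attrition check described above could, for example, contrast round 1 rating distributions per item between panellists who went on to complete round 2 and those who withdrew. The following is a minimal sketch under assumed data structures (the panellist identifiers and items "A"/"B" are hypothetical, and the study's actual analysis may use formal distributional comparisons rather than medians):

```python
# Hedged sketch of an attrition-bias check: compare round 1 rating
# distributions, per item, for completers vs withdrawn panellists.
# All identifiers, items and ratings below are hypothetical.
from statistics import median

round1 = {  # panellist id -> round 1 ratings for items "A" and "B"
    "p1": {"A": 7, "B": 2}, "p2": {"A": 6, "B": 3},
    "p3": {"A": 7, "B": 6}, "p4": {"A": 2, "B": 7},
}
completed_round2 = {"p1", "p2"}  # panellists who also responded in round 2

def attrition_summary(round1, completers):
    """Median round 1 rating per item, for completers vs withdrawn panellists."""
    items = sorted(next(iter(round1.values())))
    out = {}
    for item in items:
        stay = [r[item] for pid, r in round1.items() if pid in completers]
        gone = [r[item] for pid, r in round1.items() if pid not in completers]
        out[item] = {"completers": median(stay), "withdrawn": median(gone)}
    return out

print(attrition_summary(round1, completed_round2))
```

A marked divergence between the two groups' distributions for an item would suggest that withdrawal was not random with respect to opinion, qualifying the interpretation of round 2 consensus.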

Consensus meeting

On completion of the two e-Delphi survey rounds, a consensus meeting will be held virtually among steering group members. This will use a nominal group technique to decide on a final set of checklist items and accompanying explanatory text. The meeting will begin with a brief overview of the study background and aims, followed by a summary of results of the e-Delphi survey rounds. The main focus will then be on discussion of items for which consensus could not be reached, and attendees will have the opportunity to anonymously vote for inclusion or exclusion of each of these in the final checklist. Items for which ≥75% of attendees vote to include will be added to the final checklist. The meeting will be recorded to aid reporting.

Phase 3: Pilot testing

Before pilot testing of the checklist, we will develop an R-markdown template to aid implementation of the checklist. Next, members of the steering group will pilot test the checklist and R-markdown by applying them to IPD-MA they are conducting. We will also invite a convenience sample of systematic reviewers who have experience in conducting IPD-MA to conduct pilot testing. The aim of this phase is to refine the items and explanatory text and to streamline and improve the R-markdown based on participants’ feedback using an iterative process until agreement on the final R-markdown is reached among the steering group.

Data management

All data will be stored in a secure password-protected folder at the NHMRC Clinical Trials Centre, University of Sydney, which is only accessible by authorised research personnel. Data management will be in accordance with the University of Sydney Data Management Policy 2014.

Dissemination

The final checklist and accompanying R-markdown program will be widely disseminated via peer-reviewed, open-access publication, conference presentations and relevant social media (e.g. Twitter). We will also use our connections within networks such as Cochrane to encourage rapid integration into international best practice guidelines for systematic reviews and use of the checklist by world-leading organisations in this field.

Ethics

Ethics approval is not required for the phase 1 scoping review, as confirmed by the Chair of The University of Sydney Human Research Ethics Committee. Once the e-Delphi survey has been designed (phase 2) we will seek appropriate ethics approval. This study has been pre-registered on the Open Science Framework.

Discussion

The importance of performing data quality checks and cleaning for IPD-MA is widely recognised, but there is limited existing guidance on what this should entail. We will address this gap by developing a checklist of standard items that researchers can use to help ensure the highest quality data are included in their review. This will improve the quality of reviews, reduce the risk of biased meta-analysis results and strengthen the evidence base for guidelines and clinical practice, ultimately improving health outcomes for the public. Strengths of this study include the use of scoping review methodology to develop initial survey items, and the use of e-Delphi survey methodology, which enables quasi-anonymity of participant responses, global accessibility (facilitating a greater sample size and representativeness), streamlined data collection, reduced administrative costs and open sharing of opinions without undue influence from dominant individuals [30, 31]. Provision of controlled feedback between the two survey rounds also allows participants to reconsider their initial ratings, taking into account the views of other stakeholders [21]. Potential limitations include the possibility of a low number of participants and participant attrition. We will mitigate these risks by recruiting as many eligible participants as possible, clearly outlining the requirements of participation from the outset, sending reminder emails, and scheduling only 2–3 weeks between rounds to maintain participant engagement [21, 25, 27].

Patient and public involvement

As part of the scoping review, a patient representative with previous research experience will be consulted to provide feedback on the list of items.

PRISMA-P checklist (modified for scoping review).

(DOC) Click here for additional data file. 31 Aug 2022
PONE-D-22-18844
Development of a checklist of standard items for processing individual participant data from randomised trials for meta-analyses: protocol for a modified e-Delphi study PLOS ONE Dear Dr. Hunter, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points detailed below that were raised during the review process. Please submit your revised manuscript by Oct 14 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript: A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'. A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'. An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.
If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Matthew Carroll, PhD., MEdL., MPod., BHSc Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf 2. Thank you for stating the following in the Competing Interests section: "I have read the journal's policy and the authors of this manuscript have the following competing interests: KEH receives research funding support via two scholarships administered by the University of Sydney (Postgraduate Research Supplementary Scholarship in Methods Development (SC3504), and Research Training Program Stipend (SC3227)). ALS is co-convenor and KEH & ACW are associate convenors of the Cochrane Prospective Meta-analysis Methods Group. 
MJP is recipient of the Australian Research Council Discovery Early Career Researcher Award (DE200101618), co-convenor of the Cochrane Bias Methods Group, and President of the Association for Interdisciplinary Meta-research and Open Science. RW is recipient of a National Health and Medical Research Council Investigator Grant. VB is supported by an Alfred Deakin Postdoctoral Research Fellowship. MC is co-convenor (unpaid) of the Cochrane Individual Participant Data Meta-analysis Methods Group; LHMR is coordinator of this group; KEH, PJG, BWM, MC and ALS are members. LHMR is supported by the UK Medical Research Council (https://mrc.ukri.org/) Grant number: MC_UU_00004/06. ALS is recipient of a National Health and Medical Research Council Investigator Grant. BWM is recipient of a National Health and Medical Research Council Investigator grant (GNT1176437), reports consultancy for ObsEva at an hourly rate, reports consultancy for Merck Merck KGaA at an hourly rate and received travel support from Merck Merck KGaA." Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to  PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests).  If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf. 3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. 
Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript. 4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information. 5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Does the manuscript provide a valid rationale for the proposed study, with clearly identified and justified research questions? The research question outlined is expected to address a valid academic problem or topic and contribute to the base of knowledge in the field. Reviewer #1: Yes ********** 2. Is the protocol technically sound and planned in a manner that will lead to a meaningful outcome and allow testing the stated hypotheses? The manuscript should describe the methods in sufficient detail to prevent undisclosed flexibility in the experimental procedure or analysis pipeline, including sufficient outcome-neutral conditions (e.g. necessary controls, absence of floor or ceiling effects) to test the proposed hypotheses and a statistical power analysis where applicable. 
As there may be aspects of the methodology and analysis which can only be refined once the work is undertaken, authors should outline potential assumptions and explicitly describe what aspects of the proposed analyses, if any, are exploratory. Reviewer #1: Yes ********** 3. Is the methodology feasible and described in sufficient detail to allow the work to be replicable? Descriptions of methods and materials in the protocol should be reported in sufficient detail for another researcher to reproduce all experiments and analyses. The protocol should describe the appropriate controls, sample size calculations, and replication needed to ensure that the data are robust and reproducible. Reviewer #1: Yes ********** 4. Have the authors described where all data underlying the findings will be made available when the study is complete? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception, at the time of publication. The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes ********** 6. 
Review Comments to the Author

Please use the space provided to explain your answers to the questions above and, if applicable, provide comments about issues authors must address before this protocol can be accepted for publication. You may also include additional comments for the author, including concerns about research or publication ethics. You may also provide optional suggestions and comments to authors that they might find helpful in planning their study. (Please upload your review as an attachment if it exceeds 20,000 characters.)

Reviewer #1: I thank the authors for contributing to the field of research quality and integrity. The authors highlighted the importance of the PRIME-IPD framework but also its limitations, and they propose to develop a Delphi-informed checklist and a companion R-markdown program (a strength of the project) to guide routine data quality checking and cleaning for IPD-MA of randomised trials. Overall, the manuscript is well written and documented. I have provided some comments, mostly related to the phase 1 scoping review, to help improve the manuscript.

Major

Page 6. Please provide the direct OSF link to the registered protocol instead of the general OSF website, so that reviewers and readers can access the protocol.

Minor

Page 7. The following statement, “We will use a standardised data extraction form and check any available protocol publications, results publications and/or supplementary materials (e.g., statistical analysis plans, data management plans, PROSPERO registration records) for each included IPD-MA”, is a bit confusing because it combines the use of a standardised data extraction form with the process of looking for available protocol publications, etc. in the same sentence. For example, I do not know whether the search of the PROSPERO database is part of a structured grey literature search or not. I was not able to access the OSF-registered protocol.

Page 9. The authors provide examples of basic demographic information to be collected for representativeness: if gender is included, I think it is worth mentioning in the list of examples.

Page 12. Patient and public involvement (PPI): I understand the authors' explanation, but I think PPI is necessary and they should try to find and include patient and public representatives either in the panellist group or in the steering group.

Figures: please improve their resolution.

7. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
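The routine data quality checks the reviewer refers to (flagging duplicate, missing, out-of-range or internally inconsistent values in participant-level trial data) can be sketched in a few lines. The sketch below is illustrative only: the function, variable names (`id`, `age`, `outcome`, `randomisation_day`, `followup_day`) and plausibility thresholds are hypothetical, not the authors' checklist or their planned R-markdown program.

```python
def check_ipd(rows):
    """Flag basic quality problems in participant-level rows (list of dicts)."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row["id"] in seen_ids:                           # duplicate participant ID
            issues.append((i, "duplicate id"))
        seen_ids.add(row["id"])
        if row.get("outcome") is None:                      # missing outcome data
            issues.append((i, "missing outcome"))
        if not 18 <= row["age"] <= 110:                     # implausible (out-of-range) value
            issues.append((i, "age out of range"))
        if row["followup_day"] < row["randomisation_day"]:  # internal inconsistency
            issues.append((i, "follow-up before randomisation"))
    return issues

trial_rows = [
    {"id": 1, "age": 34, "outcome": 1, "randomisation_day": 0, "followup_day": 90},
    {"id": 1, "age": 29, "outcome": None, "randomisation_day": 0, "followup_day": 90},
    {"id": 2, "age": 230, "outcome": 0, "randomisation_day": 10, "followup_day": 5},
]
print(check_ipd(trial_rows))
# [(1, 'duplicate id'), (1, 'missing outcome'), (2, 'age out of range'), (2, 'follow-up before randomisation')]
```

In practice such checks would be run per trial before pooling, with flagged rows queried back to the original trialists rather than silently edited.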
6 Sep 2022

Point-by-point responses

Journal Requirements

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

Response: Thank you for sharing these links. The manuscript has been checked to ensure it meets all of PLOS ONE’s style requirements.

2. Thank you for stating the following in the Competing Interests section: "I have read the journal's policy and the authors of this manuscript have the following competing interests: KEH receives research funding support via two scholarships administered by the University of Sydney (Postgraduate Research Supplementary Scholarship in Methods Development (SC3504), and Research Training Program Stipend (SC3227)). ALS is co-convenor and KEH & ACW are associate convenors of the Cochrane Prospective Meta-analysis Methods Group. MJP is recipient of the Australian Research Council Discovery Early Career Researcher Award (DE200101618), co-convenor of the Cochrane Bias Methods Group, and President of the Association for Interdisciplinary Meta-research and Open Science. RW is recipient of a National Health and Medical Research Council Investigator Grant. VB is supported by an Alfred Deakin Postdoctoral Research Fellowship. MC is co-convenor (unpaid) of the Cochrane Individual Participant Data Meta-analysis Methods Group; LHMR is coordinator of this group; KEH, PJG, BWM, MC and ALS are members. LHMR is supported by the UK Medical Research Council (https://mrc.ukri.org/) Grant number: MC_UU_00004/06. ALS is recipient of a National Health and Medical Research Council Investigator Grant.
BWM is recipient of a National Health and Medical Research Council Investigator Grant (GNT1176437), reports consultancy for ObsEva at an hourly rate, reports consultancy for Merck KGaA at an hourly rate and received travel support from Merck KGaA."

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests). If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

Response: I confirm that this does not alter our adherence to all PLOS ONE policies, and have updated the Competing Interests section as required: "I have read the journal's policy and the authors of this manuscript have the following competing interests: KEH receives research funding support via two scholarships administered by the University of Sydney (Postgraduate Research Supplementary Scholarship in Methods Development (SC3504), and Research Training Program Stipend (SC3227)). ALS is co-convenor and KEH & ACW are associate convenors of the Cochrane Prospective Meta-analysis Methods Group. MJP is recipient of the Australian Research Council Discovery Early Career Researcher Award (DE200101618), co-convenor of the Cochrane Bias Methods Group, and President of the Association for Interdisciplinary Meta-research and Open Science. RW is recipient of a National Health and Medical Research Council Investigator Grant. VB is supported by an Alfred Deakin Postdoctoral Research Fellowship.
MC is co-convenor (unpaid) of the Cochrane Individual Participant Data Meta-analysis Methods Group; LHMR is coordinator of this group; KEH, PJG, BWM, MC and ALS are members. LHMR is supported by the UK Medical Research Council (https://mrc.ukri.org/) Grant number: MC_UU_00004/06. ALS is recipient of a National Health and Medical Research Council Investigator Grant. BWM is recipient of a National Health and Medical Research Council Investigator Grant (GNT1176437), reports consultancy for ObsEva at an hourly rate, reports consultancy for Merck KGaA at an hourly rate and received travel support from Merck KGaA. This does not alter our adherence to PLOS ONE policies on sharing data and materials."

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please ensure that your ethics statement is included in your manuscript, as the ethics statement entered into the online submission form will not be published alongside your manuscript.

Response: As requested, the ethics statement has been moved to the Methods section of our manuscript.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

Response: A caption for the supporting information file (PRISMA checklist) has been added to the end of our manuscript. It is not cited in-text.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references.
Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Response: The reference list has been reviewed and, to our knowledge, it is complete and correct. No changes have been made and no retracted articles have been cited.

Reviewer’s comments to the Author

Reviewer #1: I thank the authors for contributing to the field of research quality and integrity. The authors highlighted the importance of the PRIME-IPD framework but also its limitations, and they propose to develop a Delphi-informed checklist and a companion R-markdown program (a strength of the project) to guide routine data quality checking and cleaning for IPD-MA of randomised trials. Overall, the manuscript is well written and documented.

Response: Many thanks for this positive feedback.

I have provided some comments, mostly related to the phase 1 scoping review, to help improve the manuscript.

Major

Page 6. Please provide the direct OSF link to the registered protocol instead of the general OSF website, so that reviewers and readers can access the protocol.

Response: Thank you for this suggestion. The direct link to the publicly available protocol has been added to the manuscript as follows: “A full protocol for this review, covering each relevant item of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) (18), has been preregistered on the Open Science Framework (OSF, https://osf.io/g2unf/).” (p.7)

Minor

Page 7.
The following statement, “We will use a standardised data extraction form and check any available protocol publications, results publications and/or supplementary materials (e.g., statistical analysis plans, data management plans, PROSPERO registration records) for each included IPD-MA”, is a bit confusing because it combines the use of a standardised data extraction form with the process of looking for available protocol publications, etc. in the same sentence. For example, I do not know whether the search of the PROSPERO database is part of a structured grey literature search or not. I was not able to access the OSF-registered protocol.

Response: Thank you for this comment. We have re-worded relevant sections of this paragraph for clarity: “We will include systematic reviews with IPD-MA of randomised trials on intervention effects published in English. These have previously been identified up to September 2019 in a systematic review by Wang et al (5), and we will update their search to include all others published up to July 2022… … For each eligible record, we will obtain relevant supplementary materials that are attached or referred to in the publication, e.g. statistical analysis plan, data management plan, PROSPERO registration record... …We will use a standardised data extraction form which will be piloted by five independent reviewers and revised accordingly prior to commencing full extraction.” (p.7-8)

Page 9. The authors provide examples of basic demographic information to be collected for representativeness: if gender is included, I think it is worth mentioning in the list of examples.

Response: We have added gender as a demographic variable for collection (p.10).

Page 12. Patient and public involvement (PPI): I understand the authors' explanation, but I think PPI is necessary and they should try to find and include patient and public representatives either in the panellist group or in the steering group.

Response: Thank you for this suggestion.
We have invited a patient representative to participate in this project. This is articulated in the revised manuscript as follows: “A patient representative with previous research experience will also be separately consulted to provide feedback on the list of items.” (p.9) “Patient and public involvement: As part of the scoping review, a patient representative with previous research experience will be consulted to provide feedback on the list of items.” (p.14)

Figures: please improve their resolution.

Response: The resolution of both figures has been improved to 300 dpi.

Submitted filename: Response to Reviewers.docx

26 Sep 2022

Development of a checklist of standard items for processing individual participant data from randomised trials for meta-analyses: protocol for a modified e-Delphi study

PONE-D-22-18844R1

Dear Dr. Hunter,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance.
Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Matthew Carroll, PhD, MEdL, MPod, BHSc
Academic Editor
PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

[NOTE: Questions 1–5 are the same as in the first review round; Reviewer #1 answered Yes to each.]

6. Review Comments to the Author

Reviewer #1: The authors have satisfactorily responded to the comments and revised the manuscript accordingly. Thanks.

7. Do you want your identity to be public for this peer review?

Reviewer #1: Yes: Amédé Gogovor

30 Sep 2022

PONE-D-22-18844R1
Development of a checklist of standard items for processing individual participant data from randomised trials for meta-analyses: protocol for a modified e-Delphi study

Dear Dr. Hunter:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Associate Professor Matthew Carroll
Academic Editor
PLOS ONE
References:  15 in total

1.  Understanding systematic reviews and meta-analysis.

Authors:  A K Akobeng
Journal:  Arch Dis Child       Date:  2005-08       Impact factor: 3.791

2.  PRIME-IPD SERIES Part 3. The PRIME-IPD tool fills a gap in guidance for preparing IPD for analysis.

Authors:  Omar Dewidar; Alison Riddle; Elizabeth Ghogomu; Alomgir Hossain; Paul Arora; Zulfiqar A Bhutta; Robert E Black; Simon Cousens; Christine Mathew; Jessica Trawin; Peter Tugwell; Vivian Welch; George A Wells
Journal:  J Clin Epidemiol       Date:  2021-05-15       Impact factor: 6.437

3.  PRIME-IPD SERIES Part 2. Retrieving, checking, and harmonizing data are underappreciated challenges in individual participant data meta-analyses.

Authors:  Brooke Levis; Miriam Hattle; Richard D Riley
Journal:  J Clin Epidemiol       Date:  2021-05-16       Impact factor: 6.437

Review 4.  Methods to assess research misconduct in health-related research: A scoping review.

Authors:  Esmee M Bordewijk; Wentao Li; Rik van Eekelen; Rui Wang; Marian Showell; Ben W Mol; Madelon van Wely
Journal:  J Clin Epidemiol       Date:  2021-05-24       Impact factor: 6.437

Review 5.  The Delphi technique: a worthwhile research approach for nursing?

Authors:  H P McKenna
Journal:  J Adv Nurs       Date:  1994-06       Impact factor: 3.187

6.  Two different invitation approaches for consecutive rounds of a Delphi survey led to comparable final outcome.

Authors:  Anne Boel; Victoria Navarro-Compán; Robert Landewé; Désirée van der Heijde
Journal:  J Clin Epidemiol       Date:  2020-09-28       Impact factor: 6.437

7.  PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation.

Authors:  Andrea C Tricco; Erin Lillie; Wasifa Zarin; Kelly K O'Brien; Heather Colquhoun; Danielle Levac; David Moher; Micah D J Peters; Tanya Horsley; Laura Weeks; Susanne Hempel; Elie A Akl; Christine Chang; Jessie McGowan; Lesley Stewart; Lisa Hartling; Adrian Aldcroft; Michael G Wilson; Chantelle Garritty; Simon Lewin; Christina M Godfrey; Marilyn T Macdonald; Etienne V Langlois; Karla Soares-Weiser; Jo Moriarty; Tammy Clifford; Özge Tunçalp; Sharon E Straus
Journal:  Ann Intern Med       Date:  2018-09-04       Impact factor: 25.391

8.  Uptake of systematic reviews and meta-analyses based on individual participant data in clinical practice guidelines: descriptive study.

Authors:  Claire L Vale; Larysa H M Rydzewska; Maroeska M Rovers; Jonathan R Emberson; François Gueyffier; Lesley A Stewart
Journal:  BMJ       Date:  2015-03-06

9.  Recruiting and retaining participants in e-Delphi surveys for core outcome set development: Evaluating the COMiT'ID study.

Authors:  Deborah Ann Hall; Harriet Smith; Eithne Heffernan; Kathryn Fackrell
Journal:  PLoS One       Date:  2018-07-30       Impact factor: 3.240

10.  PRIME-IPD SERIES Part 1. The PRIME-IPD tool promoted verification and standardization of study datasets retrieved for IPD meta-analysis.

Authors:  Omar Dewidar; Alison Riddle; Elizabeth Ghogomu; Alomgir Hossain; Paul Arora; Zulfiqar A Bhutta; Robert E Black; Simon Cousens; Michelle F Gaffey; Christine Mathew; Jessica Trawin; Peter Tugwell; Vivian Welch; George A Wells
Journal:  J Clin Epidemiol       Date:  2021-05-24       Impact factor: 6.437

