
Data verification of nationwide clinical quality registries.

L R van der Werf1,2, S C Voeten1,3, C M M van Loe1, E G Karthaus1,3, M W J M Wouters1,4, H A Prins1.   

Abstract

Background: Clinical auditing is an emerging instrument for quality assessment and improvement. Moreover, clinical registries facilitate medical research as they provide 'real world' data. It is important that entered data are robust and reliable. The aim of this study was to describe the evolving procedure and results of data verification within the Dutch Institute for Clinical Auditing (DICA).
Methods: Data verification performed on several (disease-specific) clinical registries between 2013 and 2015 was evaluated. Sign-up, sample size and process of verification were described. For each procedure, hospitals were visited by external data managers to verify registered data. Outcomes of data verification were completeness and accuracy. An assessment of the quality of data was given per registry, for each participating hospital. Using descriptive statistics, analyses were performed for different sections within the individual registries.
Results: Seven of the 21 registries were verified, involving 174 visits to hospital departments. A step-by-step description of the data verification process was provided. Completeness of data in the registries varied from 97·2 to 99·4 per cent. Accuracy of data ranged from 88·2 to 100 per cent. Most discrepancies were observed for postoperative complications (0·7-7·5 per cent) and ASA classification (8·5-11·4 per cent). Data quality was assessed as 'sufficient' for 145 of the 174 hospital departments (83·3 per cent).
Conclusion: Data verification revealed that the data entered in the observed DICA registries were complete and accurate.
© 2019 The Authors. BJS Open published by John Wiley & Sons Ltd on behalf of BJS Society Ltd.

Year:  2019        PMID: 31832593      PMCID: PMC6887678          DOI: 10.1002/bjs5.50209

Source DB:  PubMed          Journal:  BJS Open        ISSN: 2474-9842


Introduction

Clinical auditing is predominantly an instrument for quality assessment and improvement in healthcare that can help to improve patient outcomes1, 2, 3, 4. Moreover, clinical registries facilitate evidence-based medical research as they provide 'real world' data on patients.

In 2009, the nationwide Dutch ColoRectal Audit (DCRA) was initiated by the Association of Surgeons of the Netherlands5. Together with the establishment of other clinical registries, this led to the foundation of the Dutch Institute for Clinical Auditing (DICA) in 20114, 5, 6, 7. Today, 21 clinical registries are facilitated by DICA, and by 2016 more than 500 000 patients had already been registered8. The clinical registries are disease-specific, and 16 of the 21 are surgical registries. In the Netherlands, all hospitals are obliged to participate in these registries. Annually, a set of hospital-specific outcomes is published on a public website, although only after approval by the board of each hospital9. These outcomes are used by policy-makers, health insurance companies and patient federations to assess hospital performance.

A prerequisite for using these data to compare quality between hospitals is that the entered data are robust and reliable. The validity of entered data is also essential because they are used for medical and epidemiological outcome research. A recent validation study by Cundall-Curry and colleagues10 emphasized the need for data uploaded to a national registry to be checked. Another validation of data quality in a national registry was described by Linder et al.11, who showed that the database of the registry contained reliable data. A systematic approach to data verification in nationwide clinical registries has not been described. This study aimed to describe the procedure of data verification used by DICA, the results of each verification procedure, and the lessons learned from each.

Methods

This was a retrospective descriptive study of data verification in nationwide registries in the Netherlands, a high‐income country in western Europe with approximately 17 million inhabitants. Healthcare insurance is obligatory. Most secondary healthcare is provided in public hospitals. Secondary healthcare was provided in 71 hospitals in 2018. Since 2009, several nationwide registries have been set up by what is now known as DICA. In this study, data verifications performed between 2013 and 2015 were eligible.

Data entry in the registries

Medical professionals are responsible for the correct registration of their data in the registries. At the start of the DCRA, the majority of surgeons recorded the data themselves. Today, data are recorded by medical specialists, trainees, physician assistants, data managers, and research and administrative nurses. The medical specialist remains ultimately responsible for the quality of the data entered. Data are either entered in a web-based system or delivered by the hospitals as a batch, at least once a year but preferably more often to facilitate quality improvements. Hospitals adhere to annual deadlines for delivering all data.

Organizational structure of registries in DICA

Each registry is led by a clinical audit board, consisting of medical professionals mandated by their professional association. The registries also have a scientific committee, comprising representatives of the participating centres. Together with the scientific bureau of DICA, this scientific committee defines valid quality indicators, coordinates outcomes research, and is responsible for the quality of the data.

Procedures to maintain the quality of registered data

In each clinical registry, the reliability of the data is improved and verified in four ways. First, verification systems are integrated in the web-based survey, so that the registrar receives direct feedback on erroneous, missing or unlikely data items while entering the data. Second, DICA uses a signalling list that reports erroneous and missing data for all patients in a hospital. Third, clinical experts receive a weekly updated report with their outcomes for use in clinical auditing; this report also provides the number of registered patients and the completeness of the data, which can help to identify errors early. Finally, external data verification contributes to determining the reliability of the data.
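As a rough illustration, entry-time feedback of the kind described above can be sketched as follows. This is a minimal sketch: the field names, plausibility ranges and messages are hypothetical and are not DICA's actual validation rules.

```python
def check_entry(record: dict) -> list[str]:
    """Return feedback messages for erroneous, missing or unlikely items.

    Illustrative only: the checked fields and thresholds are assumptions,
    not the real survey's rule set.
    """
    feedback = []

    # Missing or out-of-range ASA score (valid classes are 1-5).
    if record.get("asa_score") is None:
        feedback.append("ASA score is missing.")
    elif record["asa_score"] not in range(1, 6):
        feedback.append("ASA score must be between 1 and 5.")

    # Flag unlikely (but not strictly impossible) values for confirmation.
    age = record.get("age")
    if age is not None and not 0 <= age <= 110:
        feedback.append(f"Age {age} is unlikely; please verify.")

    return feedback
```

A registrar entering `{"asa_score": 7, "age": 45}` would immediately be told that the ASA score is out of range, mirroring the direct feedback loop described above.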

External data verification

A first pilot project on external data verification was initiated by the Association of Surgeons of the Netherlands in 2014. This led to the formation of a data verification department at DICA that coordinates the procedures of external verification. An independent data verification committee was appointed, consisting of medical experts, a biostatistician, a deputy of the Dutch Health Care Inspectorate and a deputy of the Netherlands Patients Federation. Since the first procedure in 2014, external data verification has been optimized based on experience gained during previous procedures. External data verification is carried out by a trusted third party to guarantee the privacy of patients: Medical Research Data Management (MRDM), Deventer, the Netherlands. MRDM is NEN 7510:2011 and ISO 27001:2013 certified, and complies with privacy regulations in the Netherlands12.

Pilot verification project

In the pilot project, the longest-existing registries of DICA, the DCRA and the Dutch Upper Gastrointestinal Cancer Audit (DUCA), were verified. In these verifications, 20 and 18 variables respectively were verified for all hospitals that participated in the registry. Per hospital, data for 20 patients were verified. Based on experience from the pilot project, the data verification procedure was modified and continued for the other registries.

Regular data verifications

Patient and variable selection for verification

The scientific committee sets selection criteria for the types of patient that should be included in the data verification, and selects the variables to be verified.

Sign‐up

Data verification was performed for each registry individually. All hospitals participating in the registry received an e-mail invitation to participate. The invitation letter explained the procedure, practical requirements and privacy aspects of data verification. Participation was voluntary and free of charge for the hospitals, although results were reported to the National Health Care Institute (Zorginstituut Nederland), which is responsible for public transparency of hospital-specific quality information in the Netherlands.

Sample sizes

As previous studies were lacking, sample sizes were set arbitrarily in a consensus meeting with the data verification committee, which included a biostatistician. The preferred number of hospitals to verify for each registry was set at 15. The number of patients to verify in each hospital was based on a percentage of the annual hospital volume or a set number of patients, with a minimum of 30 patients.
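The sampling rule above (a percentage of annual volume or a set number, floored at 30 patients) can be sketched as a small helper. The 10 per cent fraction is an illustrative assumption; the article only states that a percentage or a set number was used, with a minimum of 30.

```python
def sample_size(annual_volume: int, fraction: float = 0.10, minimum: int = 30) -> int:
    """Number of patients to verify in one hospital.

    A share of the annual hospital volume, never below the agreed minimum.
    The default fraction of 10 per cent is hypothetical.
    """
    return max(minimum, round(annual_volume * fraction))
```

For a hospital treating 100 patients a year this yields the minimum of 30; for 500 patients it yields 50.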

The process

The process of data verification in hospitals was done manually by trained employees. They were all trained by DICA in both the verification procedure and the medical content. For each hospital, the completeness of the registration was evaluated, and the accuracy of data assessed.

Completeness of the registry

For the verification, the data set of a complete registration year was used. This data set was used for clinical auditing, to calculate the quality indicators for each hospital. To verify the completeness of the registry, hospitals were asked to provide a patient list derived from their administrative system. A sample of the list was compared with patients registered in the registry. Patients who were on the patient list but missing from the registry were recorded as 'absent'.

Different types of patient list were used. In the first verified registries, a patient list derived from the nationwide network and registry of histopathology and cytopathology in the Netherlands (PALGA network13), or a patient list based on specific diagnosis-treatment combination (DBC) codes as recorded by the hospital administration and insurance companies, was used. These DBC codes are used in the Netherlands for reimbursement of all costs of delivered care and are comparable with ICD codes. Not all of these methods proved applicable for every hospital, because the PALGA system was not used in all hospitals and in some cases the DBC codes differed between hospitals. It was therefore decided that, for the studied verifications, hospitals could choose the type of patient list that fitted the aim of data verification and matched their system.
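The completeness check described above is essentially a set comparison between the hospital-provided patient list and the registry. A minimal sketch, assuming patients are identified by some shared identifier (hypothetical here):

```python
def completeness(hospital_list: list[str], registry: list[str]) -> tuple[list[str], float]:
    """Compare a hospital-provided patient list against the registry.

    Returns the patients recorded as 'absent' (on the hospital list but
    missing from the registry) and the completeness percentage.
    Identifiers are illustrative placeholders.
    """
    absent = sorted(set(hospital_list) - set(registry))
    pct_complete = 100 * (len(hospital_list) - len(absent)) / len(hospital_list)
    return absent, round(pct_complete, 1)
```

With four patients on the hospital list and three of them registered, the function reports one 'absent' patient and 75·0 per cent completeness.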

Accuracy of the data

To assess the accuracy of the data, the original data derived from the electronic patient records were compared with the data in the registry. Hospitals were not able to revise these data before verification. To record the accuracy of the data, a web-based survey was used in which the selected items to be verified were prefilled, based on the registered data. Each variable was assessed as 'not discrepant' or 'discrepant'; missing data were assessed as 'discrepant'. When discrepancies were observed, the correct information from the source data and an explanation of the discrepancy had to be noted. As a minimum, the variables needed to calculate two quality indicators were verified in all registries: 'the percentage of patients with severe complications' and 'the percentage of patients who died within 30 days after surgery'. The definition of 'severe complications' differed between registries; mostly it was 'complications leading to a prolonged hospital stay, a reintervention or death'. Variables used in the case-mix correction of outcome indicators were also verified, such as the ASA score, a scale of the preoperative fitness of patients5.
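The per-variable assessment can be sketched as a comparison of the prefilled registry record against the source record, with missing items counted as discrepant, as described above. Field names are illustrative assumptions.

```python
def assess_accuracy(registry_record: dict, source_record: dict,
                    variables: list[str]) -> dict[str, str]:
    """Assess each verified variable as 'discrepant' or 'not discrepant'.

    A variable is discrepant when it is missing from the registry or
    differs from the source (electronic patient record) value.
    """
    result = {}
    for var in variables:
        registered = registry_record.get(var)
        if registered is None or registered != source_record.get(var):
            result[var] = "discrepant"          # missing counts as discrepant
        else:
            result[var] = "not discrepant"
    return result
```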

Analysis of the data verification and results

In the process of analysing the data, the observed discrepancies were assessed by an independent data manager and a medical researcher from DICA. Data for different hospitals were analysed separately. Completeness and accuracy of the data were assessed with descriptive statistics for different sections within the registries. Analyses were performed using IBM SPSS® version 23.0 (IBM, Armonk, New York, USA). After evaluation of the discrepancies for each hospital, the results were reported to the hospitals. In an adversarial process, each hospital could respond to the detected discrepancies; the independent verification committee had the final say. A composite measure was defined for the conclusion of 'sufficient quality' or 'insufficient quality'. Table 1 shows the criteria for the conclusion of 'insufficient quality' for one of the procedures. For some other procedures, small adjustments in thresholds were made because of a low number of patients or events.
Table 1

Factors leading to the label ‘insufficient quality’

Factor          Description
Completeness    Of all patients who met the inclusion criteria, more than 2 per cent (at least 2 patients) were not registered
Mortality       Of all patients who met the inclusion criteria, one or more patients died but were not registered at all or were not registered as 'death'
Complication    Of all patients who had a complication, the complication was not registered in more than 5 per cent (at least 3 patients)
Reintervention  Of all patients who had a reintervention, the reintervention was not registered in more than 5 per cent (at least 3 patients)
Readmission     Of all patients who had a readmission, the readmission was not registered in more than 5 per cent (at least 3 patients)
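The composite rule in Table 1 can be sketched as a single decision function. The thresholds follow the table; the function shape and parameter names are assumptions, and (per the text) some procedures used slightly adjusted thresholds.

```python
def insufficient_quality(n_eligible: int, n_missing: int, n_missed_deaths: int,
                         events: dict[str, tuple[int, int]]) -> bool:
    """Return True when the data quality label is 'insufficient' (Table 1).

    events maps an event type ('complication', 'reintervention',
    'readmission') to (patients who had the event, events not registered).
    """
    # Completeness: >2 per cent of eligible patients (at least 2) unregistered.
    if n_missing / n_eligible > 0.02 and n_missing >= 2:
        return True
    # Mortality: one or more deaths missing or not registered as 'death'.
    if n_missed_deaths >= 1:
        return True
    # Complication / reintervention / readmission: >5 per cent (at least 3)
    # of patients with the event had it unregistered.
    for n_with_event, n_unregistered in events.values():
        if n_with_event and n_unregistered / n_with_event > 0.05 and n_unregistered >= 3:
            return True
    return False
```

For example, 3 of 100 eligible patients unregistered (3 per cent, at least 2 patients) already triggers the 'insufficient' label, whereas 2 unregistered complications among 40 patients with a complication (exactly 5 per cent, fewer than 3 patients) does not.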
The conclusion regarding the quality of the data and an anonymous summary report were communicated to the hospitals, to help them learn from the discrepancies and optimize their registration procedure. The results were also reported to the National Health Care Institute.

Results

Since 2014, seven of the 21 registries have been verified individually. Information about the different verifications is shown in Tables 2 and 3.
Table 2

Characteristics and results of pilot verifications in 2013

                                            DCRA pilot          DUCA pilot
Registry year of verification               2013                2013
Validation
  Variables verified                        20                  18
  Hospitals that signed up*                 77 (88)             28 (88)
  Hospitals verified                        77                  28
  Patients verified per hospital            20                  20
Completeness
  Missing patients                          271 of 9679 (2·8)   10 of 1251 (0·8)
  Missed deaths                             24                  1
  Missed patients with severe complications 55                  2
Accuracy
  Total no. of patients in sample           1570                560
  Discrepant deaths                         5 (0·3)             0 (0)
  Discrepant complications                  117 (7·5)           17 (3·0)
  Discrepant reinterventions                29 (1·8)            9 (1·6)
  Discrepant ASA score                      134 (8·5)           64 (11·4)
  Discrepant radicality                     4 of 415 (1·0)      11 of 235 (4·7)
Objections
  No. of hospitals                          22                  16

Values in parentheses are percentages.

*Sign‐up for the Dutch ColoRectal Audit (DCRA) and the Dutch Upper Gastrointestinal Cancer Audit (DUCA) was done together.

Verification of completeness for DCRA and DUCA was done for all registered patients.

Table 3

Characteristics and results of verifications in 2014 and 2015

                                            DLCA                DACI                DSAA                DATO                DPCA
Registry year of verification               2014                2015                2015                2015                2015
Validation
  Variables verified                        17                  6                   9                   9                   13
  Hospitals that signed up                  29 of 43 (67)       36 of 53 (68)       39 of 60 (65)       12 of 20 (60)       12 of 19 (63)
  Hospitals verified                        15                  13                  14                  12                  12
Completeness
  Patients verified per hospital            ± 26                ± 22                ± 21                ± 35                ± 30
  Missing patients*                         5 of 830 (0·6)      2 of 286 (0·7)      5 of 294 (1·7)      5 of 417 (1·2)      2 of 333 (0·6)
  Missed deaths                             0                   0                   0                   0                   0
  Missed patients with severe complications 3                   1                   2                   1                   1
Accuracy
  Total no. of patients in sample           388                 281                 298                 420                 358
  Discrepant deaths                         0 (0)               2 (0·7)             0 (0)               0 (0)               0
  Discrepant complications                                      13 (4·6)            22 (7·4)
  Discrepant severe complications           216 of 6596 (3·3)                                           3 (0·7)             18 (5·0)
  Discrepant reinterventions                                    0 (0)               0 (0)               3 (0·7)
  Discrepant readmissions                                                           6 (2·0)

DLCA, Dutch Lung Cancer Audit; DACI, Dutch Audit for Carotid Interventions; DSAA, Dutch Surgical Aneurysm Audit; DATO, Dutch Audit for Treatment of Obesity; DPCA, Dutch Pancreatic Cancer Audit.
Objections
  No. of hospitals                          6                   7

Values in parentheses are percentages. Some cells are empty because the information was not available.

Verification of completeness for these registries was done for all patients in the sample.

Percentage calculated as the proportion of discrepant registrations of the total complications that could be registered for patients in the sample.

In the pilot procedure, for all hospitals that signed up (77 in the DCRA and 28 in the DUCA), 18-20 variables and all patients eligible in 2013 were verified. This procedure proved to be very time-consuming, logistically challenging and financially unfavourable. Therefore, a more limited set of variables was used for subsequent verifications, and the number of hospitals was limited to 15 per registry; these hospitals were selected randomly by the trusted third party, MRDM.

Regular verification project

The verified variables differed between registries; all verified variables are shown in Tables 2 and 3. In the seven included data verification procedures, the percentage of hospitals that signed up for verification varied between registries from 60 to 88 per cent. In two verifications, some hospitals withdrew after selection because they were unable to comply with the conditions for verification (no time or priority for preparation). In two other verifications, fewer than 15 hospitals signed up. In 2015, an online survey was undertaken to investigate the reasons for not signing up. The commonest reasons were that centres had intended to sign up but forgot, were too late or miscommunicated (8 of 21 answers), lack of time (4 of 21), and disagreement with, or doubts about, the legality of the verification procedure (4 of 21).

Sample size

The number of patient records that were verified varied per registry, from 281 to 1570 (median 388). The percentage of unregistered patients varied from 0·6 to 2·8 per cent between registries. Details of these ‘missing patients’ are shown in Tables 2 and 3. Most discrepancies were observed in postoperative complications and ASA score (Tables 2 and 3). In 3·0–7·5 per cent of the total number of patients in the sample, registration of postoperative complications was discrepant, either wrongly registered or not registered. In 8·5–11·4 per cent of the total number of patients, an incorrect ASA score was registered or missing.

Results of the procedures

In 29 of 174 data verification processes performed, the quality of data was assessed as ‘insufficient’ according to the criteria. The number of hospitals that responded to the results or lodged an objection ranged from 6 to 22 per registry (Tables 2 and 3).

Lessons learned from the results of each verification

An overview of the derived lessons is shown in Table 4. As concluded from discussions with the registrars, the most common discrepancies in the verifications seemed to be caused by unclear definitions and descriptions of variables. This was seen in six of seven verifications. The variables with the most discrepancies included the M status of the tumour, ASA score, the urgency of surgery, intraoperative complications, postoperative complications, reinterventions, and the number of days in the ICU. Incorrect inclusion and incorrect exclusion of patients in the registries were also observed.
Table 4

Lessons learned from the verifications

Numbers in parentheses indicate in how many of the seven verifications (DCRA pilot, DUCA pilot, Dutch Lung Cancer Audit, Dutch Audit for Carotid Interventions, Dutch Surgical Aneurysm Audit, Dutch Audit for Treatment of Obesity and Dutch Pancreatic Cancer Audit) each lesson was noted.

Lessons derived for the procedure of data verification
  More extensive training for verification employees needed (1)
  Patient list not suitable (5)
  Selection of hospitals: too many hospitals verified (2)
  Selection of variables: too many variables verified (2)
  Time-consuming to evaluate completeness for all patients rather than a sample (1)
  Selection of patients: too few patients verified (3)
  Privacy of patient records during the procedure was complex (2)
  Criteria for 'sufficient/insufficient' need to be set before start of data verification (2)
  Criteria for 'sufficient/insufficient' need to be changed (3)
  Criteria for 'sufficient/insufficient' are without nuance (5)
  Data verification has to become a continuous process in the audit cycle (7)
Lessons derived for registrars
  Need to fill in all variables, also when not required (1)
  Complications need to be registered more precisely (5)
  ASA score needs to be registered as described in the anaesthesia report (2)
  Date of surgery has to be registered more precisely (2)
  Date of discharge has to be registered more precisely (1)
  Hospitals must adhere to inclusion and exclusion criteria (1)
Lessons derived for the audits
  Need for clear definitions of variables (6)
  Error in data structure discovered (1)

Discussion

This study showed that verification of the completeness and accuracy of registry data is essential. The strength of the described process is that a dedicated team within the audit organization initiates and coordinates nationwide data verifications of the registries. By learning from every verification, the process of verification was improved continuously. Data verification may help to improve the survey of the registries and thereby contribute to higher-quality data sets. The most important lesson derived from the verifications is the need for clear definitions of variables.

In the first verification procedures, many of the missing patients had severe complications or had died. These discrepancies may have occurred because hospitals were afraid of criticism if they registered all patients with complications. Another explanation might be that hospitals were not able to follow some of their patients with complications, as these patients are often treated on different wards (such as the ICU) or even transferred to another hospital. Because the registry is used to compare hospitals, it is imperative that all hospitals have a complete registry. Verification of data completeness may stimulate hospitals to adhere to the proposed rules of data entry.

The verification of data accuracy is also important. One of the requirements for accurate data is the use of clear definitions for variables that are open to multiple interpretations. Many discrepancies, however, were seen for simple, unambiguous variables, such as date of surgery and date of discharge. Because length of stay and waiting times are frequently used as quality indicators, these results indicate that simple variables should also be verified. By detecting common discrepancies, such as those resulting from unclear descriptions of items, the survey could be improved through clarification of definitions, preventing incorrect data in the future.
Furthermore, by reporting erroneous data, registrars in hospitals can learn lessons and improve their registrations. A side-effect of integrating data verification in the cycle of clinical auditing might be that it stimulates hospitals to register correctly, because they know their data will be verified. This so-called Hawthorne effect describes improved results arising from increased awareness of an outcome, in this situation the collection of correct data14. All of these mechanisms could benefit the quality of the data sets and may lead to more valid registries and more reliable data for outcome research. Valid registries are important because the results of quality indicators are publicly available to policy-makers, health insurance companies and patient federations.

The described process also has limitations, which could be improved upon. Hospitals that might intentionally register incorrectly or incompletely were not identified by the present procedure, because signing up for data verification was voluntary. Hospitals can influence their published results by intentionally registering incorrect or incomplete data. This might be a problem because the results are used for clinical auditing and comparisons between hospitals. An argument against making verification mandatory is that some medical specialists already feel criticized by clinical auditing, which takes time; forcing data verification on them may create resistance in the field. For the integrity of verification, however, it is desirable that the National Health Care Institute (Zorginstituut Nederland) makes the process of data verification mandatory. Alternatively, details on sign-up and participation in data verification could be made publicly transparent, and used to assess the validity of indicator results for individual hospitals. Another limitation of the present procedure is the difficulty of verifying the completeness of the registry.
At present, hospitals are free to choose which patient list they provide. A frequently used patient list is one extracted from the electronic patient record system. This strategy is not protected against flaws, because this list could be the same as that used to select patients for registration. A further disadvantage is that hospitals could manipulate the patient list if they wanted to 'hide' patients with severe complications. The results of the verifications, however, showed that the use of these self-provided lists succeeded in identifying unregistered patients.

To improve the registries further and provide valuable, verified benchmark data to all parties involved, DICA aims to develop a system in which data verification becomes a continuous process, as part of the registry. For this purpose, data verification is included in the annual budget. This year will be the first in which data verification is repeated in two registries that were verified previously, 3 years ago. Regarding the optimal sample size for verification, it has proved difficult to balance costs against the certainty of the verification. In the near future, a pilot will be started to verify clinical outcome registry data in a more automated process. This pilot aims to select patients with a high risk of discrepancy15. The hypothesis is that verification of these high-risk patients will lead to a higher sensitivity for discrepancies when the same sample size is used as in the present procedure. As sample size directly influences costs, this procedure will be more cost-effective. The pilot is to be funded by Stichting Kwaliteitsgelden Medisch Specialisten (SKMS), a Dutch foundation aimed at quality improvement by medical specialists, which is part of the Dutch Federation of Medical Specialists. For most verifications, the absence of clear and uniform definitions of items caused the most discrepancies.
DICA will make an important improvement by creating uniform, clear and correct definitions for items in all registries. Recently, a project was launched for this purpose. In this project, as many items as possible will be defined equally in all registries, with an attempt to use existing guidelines, classifications and definitions, such as the definitions used in SNOMED Clinical Terms and ICD‐10 codes. SKMS also supports this project. It is expected that registration of data will become increasingly automated in the near future. The authors envisage that correct data from electronic patient records will be uploaded automatically to the registry without the use of data managers.

Collaborators

R. Veenstra (Medical Research Data Management, Deventer, the Netherlands) is a collaborating author.
References

1.  Gaming in risk-adjusted mortality rates: effect of misclassification of risk factors in the benchmarking of cardiac surgery risk-adjusted mortality rates.

Authors:  Sabrina Siregar; Rolf H H Groenwold; Michel I M Versteegh; Luc Noyez; Willem Jan P P ter Burg; Michiel L Bots; Yolanda van der Graaf; Lex A van Herwerden
Journal:  J Thorac Cardiovasc Surg       Date:  2012-04-14       Impact factor: 5.209

2.  Early outcomes from the Dutch Upper Gastrointestinal Cancer Audit.

Authors:  L A D Busweiler; B P L Wijnhoven; M I van Berge Henegouwen; D Henneman; N C T van Grieken; M W J M Wouters; R van Hillegersberg; J W van Sandick
Journal:  Br J Surg       Date:  2016-10-05       Impact factor: 6.939

3.  Clinical auditing as an instrument for quality improvement in breast cancer care in the Netherlands: The national NABON Breast Cancer Audit.

Authors:  Annelotte C M van Bommel; Pauline E R Spronk; Marie-Jeanne T F D Vrancken Peeters; Agnes Jager; Marc Lobbes; John H Maduro; Marc A M Mureau; Kay Schreuder; Carolien H Smorenburg; Janneke Verloop; Pieter J Westenend; Michel W J M Wouters; Sabine Siesling; Vivianne C G Tjan-Heijnen; Thijs van Dalen
Journal:  J Surg Oncol       Date:  2016-11-25       Impact factor: 3.454

4.  Data errors in the National Hip Fracture Database: a local validation study.

Authors:  D J Cundall-Curry; J E Lawrence; D M Fountain; C R Gooding
Journal:  Bone Joint J       Date:  2016-10       Impact factor: 5.082

5.  Use of continuous quality improvement to increase use of process measures in patients undergoing coronary artery bypass graft surgery: a randomized controlled trial.

Authors:  T Bruce Ferguson; Eric D Peterson; Laura P Coombs; Mary C Eiken; Meghan L Carey; Frederick L Grover; Elizabeth R DeLong
Journal:  JAMA       Date:  2003-07-02       Impact factor: 56.272

6.  Validation of data quality in the Swedish National Register for Oesophageal and Gastric Cancer.

Authors:  G Linder; M Lindblad; P Djerf; P Elbe; J Johansson; L Lundell; J Hedberg
Journal:  Br J Surg       Date:  2016-07-28       Impact factor: 6.939

7.  Implementing the National Hip Fracture Database: An audit of care.

Authors:  Nirav K Patel; Khaled M Sarraf; Sarah Joseph; Chooi Lee; Fiona R Middleton
Journal:  Injury       Date:  2013-05-13       Impact factor: 2.586

8.  The Dutch surgical colorectal audit.

Authors:  N J Van Leersum; H S Snijders; D Henneman; N E Kolfschoten; G A Gooiker; M G ten Berge; E H Eddes; M W J M Wouters; R A E M Tollenaar; W A Bemelman; R M van Dam; M A Elferink; Th M Karsten; J H J M van Krieken; V E P P Lemmens; H J T Rutten; E R Manusama; C J H van de Velde; W J H J Meijerink; Th Wiggers; E van der Harst; J W T Dekker; D Boerma
Journal:  Eur J Surg Oncol       Date:  2013-07-18       Impact factor: 4.424

9.  Evaluating national practice of preoperative radiotherapy for rectal cancer based on clinical auditing.

Authors:  N J van Leersum; H S Snijders; M W J M Wouters; D Henneman; C A M Marijnen; H R Rutten; R A E M Tollenaar; P J Tanis
Journal:  Eur J Surg Oncol       Date:  2013-06-28       Impact factor: 4.424

