
The Quality and Content of Internet-Based Information on Orthopaedic Sports Medicine Requires Improvement: A Systematic Review.

Ilona Schwarz, Darby A Houck, John W Belk, Jack Hop, Jonathan T Bravman, Eric McCarty.

Abstract

PURPOSE: To evaluate the quality and content of internet-based information available for some of the most common orthopaedic sports medicine terms.
METHODS: A search of the PubMed, Embase, and Cochrane databases following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) guidelines was performed. All English-language literature published from 2010 to 2020 discussing information quality pertaining to orthopaedic sports medicine terms was included. Outcomes included the search engines used, number and type of websites evaluated, platform, and quality scoring metrics. Descriptive statistics are presented.
RESULTS: This review includes 21 studies. Of these, 3 evaluated both the upper and lower extremity. Twelve focused on either the upper or lower extremity, most commonly rotator cuff tears (3 of 12) and/or anterior cruciate ligament pathologies (7 of 12). The most common engines were Google (18 of 21), Bing (16 of 21), Yahoo (16 of 21), YouTube (3 of 21), Ask (3 of 21), and AOL (2 of 21). The average number of media files assessed per study was 87 ± 55. Website quality was assessed with DISCERN (7 of 21), Flesch-Kincaid (9 of 21), Health on the Net (7 of 21), and/or Journal of the American Medical Association Benchmark (7 of 21) scores. YouTube was evaluated with Journal of the American Medical Association Benchmark scores (1.74 ± 1.00). Image quality was reported in 2 studies and varied with search terminology.
CONCLUSIONS: The results of this systematic review suggest that physicians should improve the quality of online information and encourage patients to access credible sources when conducting their own research.
CLINICAL RELEVANCE: Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.
© 2021 by the Arthroscopy Association of North America. Published by Elsevier Inc.


Year:  2021        PMID: 34712992      PMCID: PMC8527260          DOI: 10.1016/j.asmr.2021.05.007

Source DB:  PubMed          Journal:  Arthrosc Sports Med Rehabil        ISSN: 2666-061X


Patients have immediate access to powerful search engines and often use the internet to obtain inexpensive, quick medical advice. Previous studies have evaluated the reliability of public-access websites and have reported that many lack high-quality, accurate information. A unique subset of patients that has yet to be investigated in this context is orthopaedic athletes. Surgical interventions often have recovery periods that impact quality of life, especially in an active population in which an injury causes a significant decrease in daily activity. It is common for the surgeon to encourage limited use of an injured area, or even complete immobilization, to promote healing. Many active individuals facing such downtime turn to the internet, an easily accessed wealth of information. The purpose of this study was to evaluate the quality and content of internet-based information available for some of the most common orthopaedic sports medicine terms. We hypothesized that websites with a Health on the Net (HON) seal or those authored by academic institutions would provide the most medically accurate, safe, and pertinent information, whereas websites published by individuals or for-profit businesses would provide the least.

Methods

Two independent reviewers (D.A.H. and J.W.B.) searched PubMed, Embase, and the Cochrane Library up to June 12, 2020. The following search terms were used: (internet information quality) AND (anterior cruciate ligament) OR (meniscal) OR (shoulder instability) OR (Bankart) OR (rotator cuff) OR (shoulder) OR (tennis elbow) OR (lateral epicondylitis) OR (medial collateral ligament) OR (posterior cruciate ligament) OR (osteochondral defect) OR (cartilage defect) OR (clavicle) OR (knee). A total of 324 records were identified through the search of the 3 databases. Preliminary search results were screened by title and/or abstract to determine study eligibility based on the inclusion criteria: studies discussing internet information quality pertaining to common orthopaedic sports medicine topics, including anterior cruciate ligament (ACL) rupture, medial collateral ligament (MCL) tear, posterior cruciate ligament tear, meniscal tear, osteochondral (cartilage) defect of the knee, shoulder labral (Bankart) tear, rotator cuff tear, shoulder arthritis, clavicle fracture, and/or lateral epicondylitis (tennis elbow); full-text studies published in the English language; studies of Level I to IV evidence; and studies published from 2010 to 2020. Studies were included if they searched at least one of the following engines: Google (Google LLC, Mountain View, CA), Yahoo (Verizon Media, New York, NY), YouTube (Google LLC), Ask (IAC Search and Media, Oakland, CA), AOL (Verizon Media, New York, NY), and/or Bing (Microsoft Corporation, Redmond, WA). Non-English-language studies, studies for which the full text was not available, cadaveric studies, basic science articles, case reports, personal correspondence, studies that did not evaluate search engines or consider a medical problem, and studies that were not related to orthopaedic sports medicine were excluded.
Twenty-one studies met the inclusion and exclusion criteria (Fig 1). Data extraction from each study was performed independently (I.S.). Disclosure of funding and third-party involvement were not required to obtain any of the collected data.
Fig 1

PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) flow diagram showing study search and screening results.


Reporting Outcomes

The outcomes extracted included the primary search engines used, the number of websites evaluated by each study, the type or classification of the websites, the primary platform of the search (websites/Web pages, videos, or images), and the metrics used to score the websites. Scoring systems included the DISCERN instrument, the Flesch-Kincaid (FK) tool,5, 6, 7 the Journal of the American Medical Association (JAMA) Benchmark score, and/or the HON foundational principles.8, 9, 10 The DISCERN instrument consists of 15 separate questions, each aimed at a specific quality criterion, plus an overall quality rating. The DISCERN categories include reliability, treatment choices, and overall quality. The FK tool is the most widely used measure of reading ease. The tool has 2 parts: reading ease and grade level. The first number in a score indicates reading ease (0-100); the second indicates the average reading grade level. The national average reading level is an eighth-grade level, whereas the recommended reading level for material published for the layperson is a sixth-grade level. Both reading ease and grade level are calculated from the same 2 metrics: word length and sentence length. Reading ease and grade level are inversely related: a higher reading ease correlates to a lower grade level. (Formulas are available in Appendix 1.) The HON seal is granted based on 6 core principles: quality, confidentiality, neutrality, transparency, community, and visibility.8, 9, 10 The JAMA Benchmark score ranges from 0 to 4 points; the 4 criteria are author description, references, dating, and disclosures. One point is given for each criterion present, and a score of 3 or greater is considered "high quality."
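The FK formulas referenced above (Appendix 1) are the standard published ones and can be sketched in code. This is an illustrative implementation: the Flesch constants are standard, but the vowel-group syllable counter is a rough heuristic (formal readability tools use dictionary-based syllable counts).

```python
import re

def fk_scores(text):
    """Return (Flesch reading ease, Flesch-Kincaid grade level).

    Both scores are built from the same two metrics noted above:
    sentence length (words per sentence) and word length
    (syllables per word).
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # Rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    wps = len(words) / len(sentences)                    # words per sentence
    spw = sum(syllables(w) for w in words) / len(words)  # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw   # 0-100; higher = easier
    grade = 0.39 * wps + 11.8 * spw - 15.59     # U.S. school grade level
    return round(ease, 1), round(grade, 1)
```

The inverse relation described above falls out of the formulas: the same terms that lower reading ease (longer sentences, longer words) raise the grade level.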

Statistical Analyses

Descriptive data are presented. Owing to the heterogeneity among studies, no calculable data or meta-analyses are presented in this review.

Results

This systematic review was conducted based on the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) checklist and guidelines.

Included Studies

A total of 324 records were identified through the search of the 3 databases; 21 studies met the inclusion criteria.5, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31 Table 1 details the 21 included studies, all published between 2010 and 2020. All of the studies were descriptive or evaluative given the nature of the topic being discussed.
Table 1

Summary of Included Studies

Study Authors | Study Title | Journal | Country | Level of Evidence
Akinleye et al.12 (2018) | "Readability of the Most Commonly Accessed Arthroscopy-Related Online Patient Education Materials" | Arthroscopy | United States | Level VI: descriptive study
Akpolat and Kurdal13 (2020) | "Is Quality of YouTube Content on Bankart Lesion and Its Surgical Treatment Adequate?" | Orthopedic Surgery and Research | Turkey | Level VI: descriptive study
Bruce-Brand et al.14 (2013) | "Assessment of the Quality and Content of Information on Anterior Cruciate Ligament Reconstruction on the Internet" | Arthroscopy | Ireland | Level VI: descriptive study
Cassidy et al.15 (2018) | "YouTube Provides Poor Information Regarding Anterior Cruciate Ligament Injury and Reconstruction" | Knee Surgery, Sports Traumatology, Arthroscopy | Ireland | Level VI: descriptive study
Celik et al.16 (2020) | "Assessment of the Quality and Reliability of the Information on Rotator Cuff Repair on YouTube" | Orthopaedics & Traumatology, Surgery & Research | Turkey | Level VI: descriptive study; case series
Dalton et al.17 (2015) | "Availability of Accessible and High-Quality Information on the Internet for Patients Regarding the Diagnosis and Management of Rotator Cuff Tears" | Journal of Shoulder and Elbow Surgery | Ireland | Level VI: descriptive study
DeFroda et al.18 (2019) | "Accuracy of Internet Images of Ligamentous Knee Injuries" | The Physician and Sportsmedicine | United States | Level VI: descriptive study
DeFroda et al.19 (2018) | "Internet Accuracy of Publicly Available Images of Meniscal Tears" | The Physician and Sportsmedicine | United States | Level VI: descriptive study
Devitt et al.20 (2017) | "Comparison of the Source and Quality of Information on the Internet Between Anterolateral Ligament Reconstruction and Anterior Cruciate Ligament Reconstruction: An Australian Experience" | Orthopaedic Journal of Sports Medicine | Australia | Level VI: descriptive study; cross-sectional
Duncan et al.21 (2013) | "Evaluation of Information Available on the Internet Regarding Anterior Cruciate Ligament Reconstruction" | Arthroscopy | United States | Level VI: descriptive study
Dy et al.5 (2012) | "Does the Quality, Accuracy, and Readability of Information About Lateral Epicondylitis on the Internet Vary With the Search Term Used?" | Hand (New York, NY) | United States | Level VI: descriptive study
Garcia et al.22 (2014) | "Online Resources for Shoulder Instability: What Are Patients Reading?" | Journal of Bone and Joint Surgery—American Volume | United States | Level VI: descriptive study
Goldenberg et al.23 (2019) | "Online Resources for Rotator Cuff Repair: What Are Patients Reading?" | Arthroscopy, Sports Medicine, and Rehabilitation | United States | Level VI: descriptive study
Gosselin et al.24 (2013) | "Examining Internet Resources on Gender Differences in ACL Injuries: What Patients Are Reading" | The Knee | United States | Level VI: descriptive study
Houck et al.25 (2019) | "Evaluation of Information Available on the Internet Regarding Reverse Total Shoulder Arthroplasty" | Shoulder & Elbow | United States | Level VI: descriptive study
Nwachukwu et al.26 (2018) | "The Quality of Online Resources Available to Patients Interested in Knee Biologic Therapies Is Poor" | HSS Journal: The Musculoskeletal Journal of Hospital for Special Surgery | United States | Level VI: descriptive study
O'Neill et al.27 (2014) | "An Assessment of the Readability and Quality of Elective Orthopaedic Information on the Internet" | Acta Orthopaedica Belgica | Belgium | Level VI: descriptive study
Somerson et al.28 (2018) | "Quality of Internet-Based Decision Aids for Shoulder Arthritis: What Are Patients Reading?" | BMC Musculoskeletal Disorders | United States | Level VI: descriptive study
Starman et al.31 (2010) | "Quality and Content of Internet-Based Information for Ten Common Orthopaedic Sports Medicine Diagnoses" | Journal of Bone and Joint Surgery—American Volume | United States | Level VI: descriptive study
Wang et al.29 (2017) | "Evaluation of the Quality, Accuracy, and Readability of Online Patient Resources for the Management of Articular Cartilage Defects" | Cartilage | United States | Level VI: descriptive study
Zhang et al.30 (2016) | "The Quality and Readability of Internet Information Regarding Clavicle Fractures" | Journal of Orthopaedic Science | United States | Level VI: descriptive study

Internet Search Engines

Among the included studies, Google (86%),17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31 Bing (76%),17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30 and Yahoo (76%)19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29 were the most commonly assessed search engines and were used for both website and image searches (Appendix Table 1).
Appendix Table 1

Search Engines Assessed in Included Studies

Study Authors | Ask | Bing | Google | Yahoo | YouTube | Other
Akinleye et al.12 (2018) | | | X | | |
Akpolat and Kurdal13 (2020) | | | | | X |
Bruce-Brand et al.14 (2013) | X | X | X | X | |
Cassidy et al.15 (2018) | | | | | X |
Celik et al.16 (2020) | | | | | X |
Dalton et al.17 (2015) | X | X | X | X | | AOL
DeFroda et al.18 (2019) | | X | X | | |
DeFroda et al.19 (2018) | | X | X | X | |
Devitt et al.20 (2017) | | X | X | X | | AOL, Lycos
Duncan et al.21 (2013) | X | X | X | X | |
Dy et al.5 (2012) | | X | X | X | |
Garcia et al.22 (2014) | | X | X | X | |
Goldenberg et al.23 (2019) | | X | X | X | |
Gosselin et al.24 (2013) | | X | X | X | |
Houck et al.25 (2019) | | X | X | X | |
Nwachukwu et al.26 (2018) | | X | X | X | |
O'Neill et al.27 (2014) | | X | X | X | |
Somerson et al.28 (2018) | | X | X | X | |
Starman et al.31 (2010) | | | X | X | |
Wang et al.29 (2017) | | X | X | X | |
Zhang et al.30 (2016) | | X | X | X | |
Total | 3 | 16 | 18 | 16 | 3 | 3
% | 14 | 76 | 86 | 76 | 14 | 14

Video

YouTube, the second most popular social media network, was the only search engine used to assess videos. Only 2 studies discussed the video medium: those of Akpolat and Kurdal and Cassidy et al. Cassidy et al. reported no correlation between the number of views and video quality or accuracy based on any scoring system.

Images

DeFroda et al. discussed the image medium in 2 studies. In their analysis of internet images based on knee ligament search terms, they found that inter-rater reliability was high (Cronbach α = 0.89) for "PCL tear" (posterior cruciate ligament tear) searched on Bing and nearly equivalent (Cronbach α > 0.9) for the remainder of the search queries: ACL tear, MCL tear, and LCL (lateral collateral ligament) tear. When they compared Google with Bing, the only significant difference was in the ACL group: Bing returned a significantly greater proportion of correct images, 60% compared with Google's 45% (P = .034). For MCL and LCL tear searches, Google and Bing did not differ significantly. In their study assessing meniscal images, DeFroda et al. found that search engines displayed meniscal tears with greater than 80% accuracy but that many of the images were technical and required additional education in anatomy and physiology to understand and interpret.
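Cronbach α, the inter-rater reliability statistic reported above, can be computed from per-rater score lists. The sketch below uses hypothetical ratings for illustration, not the data from DeFroda et al.

```python
from statistics import pvariance

def cronbach_alpha(ratings):
    """Cronbach's alpha for agreement among raters.

    `ratings` holds one list per rater, each scoring the same items
    (e.g., images): alpha = k/(k-1) * (1 - sum of per-rater variances
    / variance of per-item totals), with k raters and population
    variances throughout.
    """
    k = len(ratings)
    rater_vars = sum(pvariance(r) for r in ratings)
    item_totals = [sum(scores) for scores in zip(*ratings)]
    return (k / (k - 1)) * (1 - rater_vars / pvariance(item_totals))
```

Perfect agreement between raters yields α = 1.0; values above 0.9, as in the image studies, are conventionally read as excellent agreement.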

Website Affiliation

Most of the media files assessed were physician affiliated (25%), followed by news or other (15%) and industry or commercial (15%) (Appendix Table 2). Somerson et al. specifically considered source accuracy based on website type. They found that commercial websites had the most errors. When they compared academic sources with commercial sources, commercial sources had a 5 times greater chance of publishing false information. Nonprofit websites had the highest percentage of HON seals. Academic websites had the highest completeness score (19.2 ± 6.7; maximum, 49) when compared with commercial (15.2 ± 2.9), nonprofit (18.7 ± 6.8), and physician (16.6 ± 6.3) websites, indicating that even though a source may be factually correct, it could still be incomplete. This key point was highlighted by Wang et al., who found that most websites, even if considered “high quality,” failed to distinguish between focal chondral defects and diffuse osteoarthritis, an important clinical factor in an orthopaedic setting.
Appendix Table 2

Type of Data Analyzed in Each Study and Category of Information Assessed

Study Authors | No. of Media
Akinleye et al.12 (2018) | 50 websites
Akpolat and Kurdal13 (2020) | 48 videos
Bruce-Brand et al.14 (2013) | 60 websites
Cassidy et al.15 (2018) | 39 websites
Celik et al.16 (2020) | 67 videos
Dalton et al.17 (2015) | 59 websites
DeFroda et al.18 (2019) | 300 images
DeFroda et al.19 (2018) | 300 images
Devitt et al.20 (2017) | 81 websites
Duncan et al.21 (2013) | 200 websites
Dy et al.5 (2012) | 75 websites
Garcia et al.22 (2014) | 82 websites
Goldenberg et al.23 (2019) | 47 websites (categories not specified)
Gosselin et al.24 (2013) | 35 websites
Houck et al.25 (2019) | 90 websites
Nwachukwu et al.26 (2018) | 96 websites
O'Neill et al.27 (2014) | 225 websites (categories not specified)
Somerson et al.28 (2018) | 49 websites
Starman et al.31 (2010) | 154 websites
Wang et al.29 (2017) | 53 websites
Zhang et al.30 (2016) | 91 websites (categories not specified)
Total | 1,486 websites
Average ± SD | 87.51 ± 55.17

Category of information assessed, totals across studies: physician, 374 (25%); academic, 199 (13%); public education (not physician), 86 (6%); blog, 116 (8%); news or other, 220 (15%); industry or commercial, 230 (15%).

SD, standard deviation.

DISCERN Instrument

Seven studies reported on the DISCERN instrument.13, 14, 15, 16, 17 The average content-specific DISCERN score was 5.24, whereas the average non-content-specific DISCERN score was 40.55. Akpolat and Kurdal and Cassidy et al. reported YouTube content-specific DISCERN scores (2.35 ± 0.91 and 2.30 ± 0.9, respectively). Celik et al. reported an average DISCERN score on YouTube of 30.5 ± 13.9. Dalton et al. reported DISCERN scores on Ask, Bing, Google, Yahoo, and AOL averaging 39.47 ± 11.39. Devitt et al. reported overall and content-specific DISCERN scores across Bing, Google, Yahoo, AOL, and Lycos (Brightcom Group, Hyderabad, Telangana, India): overall scores of 37.3 ± 3.4 for anterolateral ligament reconstruction versus 54.4 ± 4.6 for ACL reconstruction (P < .0001) and content-specific scores of 5.3 ± 1.3 versus 11.0 ± 1.5 (P < .0001). Houck et al. reported an average content-specific DISCERN score across Bing, Google, and Yahoo of 3.4 ± 0.59.

FK Readability Test Tool

Nine studies reported on the FK readability test tool. Akinleye et al., Dalton et al., Dy et al., Garcia et al., Gosselin et al., Nwachukwu et al., O'Neill et al., Wang et al., and Zhang et al. reported on grade level, with scores ranging from 7.9 to 13.4 and an average score of 10.24. Akinleye et al., Dalton et al., Gosselin et al., and O'Neill et al. reported on FK readability, with scores ranging from 47.40 to 54.60 and an average score of 52.94 ("fairly difficult, high school").

Health on the Net

Seven studies reported on the HON foundational principles. The average percentage of websites with HON certification was 17.79%. Of these studies, 2 searched Ask, 7 searched both Google and Bing, 3 searched Yahoo, 2 searched AOL, and 1 searched Lycos; none searched YouTube (Appendix Table 1).

JAMA Benchmark Score

Seven studies reported on the JAMA Benchmark score.13, 14, 15, 16, 17 The average JAMA Benchmark score was 2.00.
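The JAMA Benchmark described in the Methods reduces to a four-point tally over the four criteria (author description, references, dating, and disclosures). A minimal sketch, with the function name and boolean inputs chosen for illustration:

```python
def jama_benchmark(authorship, references, dating, disclosure):
    """JAMA Benchmark score: 1 point per criterion present (0-4 total);
    a score of 3 or greater is considered "high quality"."""
    score = sum(map(bool, (authorship, references, dating, disclosure)))
    return score, score >= 3
```

A website naming its author, citing references, and dating its content, but omitting disclosures, would score 3 of 4 and still qualify as high quality under this benchmark.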

Discussion

In this systematic review evaluating internet-based guidance for common orthopaedic sports medicine diagnoses, most search engines preferentially populated media that lacked appropriate scientific and medical screening; the best predictors of unbiased information were the presence of an HON seal and the lack of third-party affiliation (i.e., financial incentive). The results of this review, along with the findings presented by Akinleye et al., support that the most frequently accessed websites exceed the reading-level recommendations set by the American Medical Association and National Institutes of Health. Yet, as Zhang et al. reported, the use of more complex search terms returned websites written at a higher reading grade level but not of higher quality. Most of the websites that populate the internet when frequently used orthopaedic terms and diagnoses are searched are not associated with an HON seal, meaning they are not vetted for accuracy, completeness, or reliability. Many of the images that appear when clinical diagnoses are searched do not align with the actual term used in the search. Finally, most available videos are non-educational and miss key clinical information. These inconsistencies highlight the great variability among the major search engines. In support of the findings of Bruce-Brand et al., many of the studies in this review noted that online health care information frequently omits treatment options such as doing nothing (a key feature of DISCERN scoring), risks, and prognosis. Nonetheless, website accuracy, reading level, and the presence of an HON seal were positively correlated: websites with a seal had higher overall DISCERN and JAMA Benchmark scores. This review emphasizes that there are very few checkpoints ensuring that medical information on the internet is vetted for safety and correctness.
The 21 studies in this review stressed that the use of search engines for health purposes is growing, but the general public lacks literacy regarding source credibility, which could lead to adverse health outcomes, delayed treatment, and potential exacerbation of a condition or injury. In summary, the findings of this systematic review suggest that physicians can mitigate the discrepancy between patient health literacy and internet information by taking an active role in guiding patients. Health care providers are in a unique position to encourage the use of websites with HON seals and to discourage self-diagnosis and self-treatment based on the guidance of the internet.

Future Directions

The problem of a physician having to prove or disprove a patient’s online diagnosis and presumed treatment merits continued analysis. Future studies should consider patient interaction with the internet and its impact on clinic visits, the added burden encountered by physicians, and potential correlations between internet use and physician visits.

Limitations

In this study, only complete data available on the day of the search were analyzed. Therefore, variables outside the scope of the initial search, such as standardized methodologies (several studies used their own scoring tools to evaluate website content), direct implications for patients, and clinical care correlations, do not have data available for comparison. Only 2 studies looked at images, and both focused solely on the knee, meaning there is a lack of information available on the shoulder and clavicle, other commonly injured areas. The only video streaming medium used was YouTube, which carries additional commercial bias given that it is a social media platform. In addition, the specifications of the algorithm used by each search engine are not available and could significantly impact the results that appear. We also cannot definitively know all search users' characteristics, intentions, and biases when evaluating for a systematic review. There are limits to the generalizability of this study given that the major search engines analyzed (Google, Bing, Yahoo, AOL, and Ask) constantly undergo updates and changes to how they search, to their advertisements and sponsors, and to what is deemed relevant based on user and computer data. These changes over time are not well documented, and this could impact search results in every domain. Finally, there are no well-established tools for ranking health-based information that translate across all media forms (text, images, and videos); the closest certification for information vetting is an HON seal.

Conclusions

The results of this systematic review suggest that physicians should improve the quality of online information and encourage patients to access credible sources when conducting their own research. Doctors can and should play an active role in closing the gap between the level of health literacy of their patients and that of most common online resources.
References

1.  Does the quality, accuracy, and readability of information about lateral epicondylitis on the internet vary with the search term used?

Authors:  Christopher J Dy; Samuel A Taylor; Ronak M Patel; Moira M McCarthy; Timothy R Roberts; Aaron Daluiski
Journal:  Hand (N Y)       Date:  2012-12

2.  Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement.

Authors:  David Moher; Alessandro Liberati; Jennifer Tetzlaff; Douglas G Altman
Journal:  J Clin Epidemiol       Date:  2009-07-23       Impact factor: 6.437

3.  Evaluation of the Quality, Accuracy, and Readability of Online Patient Resources for the Management of Articular Cartilage Defects.

Authors:  Dean Wang; Rohit G Jayakar; Natalie L Leong; Michael P Leathers; Riley J Williams; Kristofer J Jones
Journal:  Cartilage       Date:  2016-06-23       Impact factor: 4.634

4.  Accuracy of internet images of ligamentous knee injuries.

Authors:  Steven F DeFroda; Steven L Bokshan; Emil Stefan Vutescu; Kayleigh Sullivan; Brett D Owens
Journal:  Phys Sportsmed       Date:  2018-10-10       Impact factor: 2.241

5.  An assessment of the readability and quality of elective orthopaedic information on the Internet.

Authors:  Shane C O'Neill; Matthew Nagle; Joseph F Baker; Fiachra E Rowan; Sean Tierney; John F Quinlan
Journal:  Acta Orthop Belg       Date:  2014-06       Impact factor: 0.500

6.  Readability of the Most Commonly Accessed Arthroscopy-Related Online Patient Education Materials.

Authors:  Sheriff D Akinleye; Ryan Krochak; Nicholas Richardson; Garret Garofolo; Maya Deza Culbertson; Orry Erez
Journal:  Arthroscopy       Date:  2017-12-26       Impact factor: 4.772

7.  Online resources for shoulder instability: what are patients reading?

Authors:  Grant H Garcia; Samuel A Taylor; Christopher J Dy; Alexander Christ; Ronak M Patel; Joshua S Dines
Journal:  J Bone Joint Surg Am       Date:  2014-10-15       Impact factor: 5.284

8.  Evaluation of information available on the internet regarding reverse total shoulder arthroplasty.

Authors:  Darby A Houck; Matthew J Kraeutler; John W Belk; Eric C McCarty; Jonathan T Bravman
Journal:  Shoulder Elbow       Date:  2017-06-13

9.  Top orthopedic sports medicine procedures.

Authors:  Sebastiano Vasta; Rocco Papalia; Erika Albo; Nicola Maffulli; Vincenzo Denaro
Journal:  J Orthop Surg Res       Date:  2018-07-31       Impact factor: 2.359

10.  The Quality of Online Resources Available to Patients Interested in Knee Biologic Therapies Is Poor.

Authors:  Benedict U Nwachukwu; Ryan C Rauck; Cynthia A Kahlenberg; Chukwuma Nwachukwu; William W Schairer; Riley J Williams; David W Altchek; Answorth A Allen
Journal:  HSS J       Date:  2018-08-08