
Quality and reliability evaluation of online videos on carpal tunnel syndrome: a YouTube video-based study.

Donghee Kwak1, Jong Woong Park1, Yousun Won2, Yeongkeun Kwon3, Jung Il Lee4.   

Abstract

OBJECTIVES: With the increasing popularity of searches for medical information on YouTube, the availability of videos concerning carpal tunnel syndrome (CTS) is increasing. This study aimed to evaluate the quality and reliability of YouTube videos on CTS. SETTING AND PARTICIPANTS: No participants were included. PRIMARY AND SECONDARY OUTCOME MEASURES: We searched YouTube on 1 April 2021 using the keywords "carpal tunnel syndrome" and "carpal tunnel release" and evaluated the first 55 retrieved videos. We summarised the video characteristics, including the Video Power Index (VPI), an index designed to evaluate video popularity based on the number of likes and views, and categorised the videos by source and content. Video quality and reliability were evaluated using the Journal of the American Medical Association (JAMA) benchmark criteria, the Global Quality Score (GQS) and the Carpal Tunnel Syndrome-Specific Score (CTS-ss).
RESULTS: The mean (range) JAMA score, GQS and CTS-ss were 2.13 (1–4), 2.69 (1–5) and 5.0 (1–15), respectively. The most common video source was allied health workers, and academically sourced videos had the highest JAMA scores and GQS. The three scores were significantly correlated with each other. Multiple linear regression analysis showed that a higher JAMA score was associated with a higher like ratio, and a higher GQS was associated with a longer video running time and a greater number of comments. However, a higher VPI was not associated with higher video quality or reliability as represented by the three scores.
CONCLUSIONS: YouTube videos on CTS have low quality and reliability. Video popularity was not significantly correlated with quality or reliability. Our findings suggest that expert groups should provide and promote high-quality video content to YouTube users and patients. © Author(s) (or their employer(s)) 2022. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.


Keywords:  EDUCATION & TRAINING (see Medical Education & Training); Hand & wrist; NEUROLOGY; Neurological pain; REHABILITATION MEDICINE


Year:  2022        PMID: 35428646      PMCID: PMC9014065          DOI: 10.1136/bmjopen-2021-059239

Source DB:  PubMed          Journal:  BMJ Open        ISSN: 2044-6055            Impact factor:   3.006


Various characteristics of YouTube videos about carpal tunnel syndrome were investigated, including the number of views, number of likes, Video Power Index and video uploader. The reliability and quality of the videos were assessed using three scoring systems: the Journal of the American Medical Association benchmark criteria, the Global Quality Score and the Carpal Tunnel Syndrome-Specific Score. Although these scoring systems are subjective and unvalidated, they were independently assessed twice by two raters, with intraobserver and interobserver agreement determined using intraclass correlation coefficients. A multiple linear regression analysis was performed to identify video characteristics affecting the reliability and quality of the videos.

Background

With the internet penetration rate exceeding 50% worldwide,1 searching for health information online has become common. According to recent studies, 80% of internet users have searched for health information online,2 and up to 30% of orthopaedic patients have searched online for disease information.3 Furthermore, well-designed videos of disease information positively affect treatment outcomes by improving patient comprehension.4 5 However, most online information is not regulated, resulting in the spread of inaccurate and low-quality information among patients.6–10 Therefore, physicians should properly evaluate such information and help patients obtain accurate information and appropriate treatment.
YouTube, on which over 1 billion users watch over 1 billion hours of video each day, is a representative source of video-based educational content.11 Although some high-quality orthopaedic content is uploaded to YouTube by qualified experts, most of the related content is uploaded by unqualified individuals, providing patients with inaccurate and erroneous information. Previous quality-evaluation studies in the orthopaedic field found that YouTube video accuracy and quality were low.1 10 12–14 According to previous studies that investigated the quality of carpal tunnel syndrome (CTS) information provided by internet search engines,15–17 the quality of online information has improved over the past decade but remains low, with significant scope for improvement. Likewise, recent studies18 19 reported that most YouTube videos and websites providing information on CTS can reinforce misconceptions. Two quality-evaluation studies of CTS information on YouTube20 21 focused on video quality and reliability, and neither examined the relationship between video characteristics, such as popularity, and quality.
The current study aimed to (1) evaluate the quality and reliability of YouTube videos concerning CTS; (2) investigate the video characteristics, sources and contents; and (3) determine the relationship between video characteristics and quality.

Methods

Patient and public involvement

No patients were involved.

YouTube search design and study setting

The YouTube online library (https://www.youtube.com) was searched on 1 April 2021 using the terms “carpal tunnel syndrome” and “carpal tunnel release”. For each keyword, the first 50 retrieved videos, sorted by ‘view count’, were selected, for a total of 100 videos. Of these, 45 were excluded (duplicates, 39; non-English, 3; information on cubital tunnel syndrome, 2; and soundtrack with no mention of carpal tunnel, 1). Thus, 55 YouTube videos found using the keywords “carpal tunnel syndrome” and “carpal tunnel release” were analysed (figure 1). The URL of each video is listed in online supplemental table 1.
Figure 1

Search methodology for carpal tunnel syndrome-related YouTube videos.

Data on the following video characteristics were collected from each YouTube video: (1) title; (2) channel name; (3) number of subscribers; (4) video running time; (5) number of views; (6) number of comments; (7) video source/uploader; (8) content type; (9) days since upload; (10) view ratio (number of views/days since upload); (11) number of likes; (12) number of dislikes; (13) like ratio (likes×100/[likes+dislikes]); and (14) Video Power Index (VPI). The VPI was calculated as like ratio×view ratio/100; it is an index designed to evaluate video popularity based on the number of likes and views.1 Video sources/uploaders were categorised as follows1 10: (1) academic (uploaders affiliated with universities or research groups); (2) physician (individual physicians or physician groups not affiliated with a university or research institute); (3) non-physician (allied health workers such as alternative medical providers, physiotherapists and occupational therapists); (4) trainer; (5) medical source (animations or related content from health websites); (6) patient; and (7) commercial. Contents were categorised as follows: (1) exercise training; (2) disease-specific information; (3) patient experience; (4) surgical technique; (5) non-surgical management, such as chiropractic treatment; and (6) advertisement.
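As a worked example, the view ratio, like ratio and VPI formulas above can be computed directly. This is a minimal sketch; the video metrics plugged in at the end are hypothetical, not taken from the study data.

```python
def view_ratio(views: int, days_since_upload: int) -> float:
    # number of views per day since upload
    return views / days_since_upload

def like_ratio(likes: int, dislikes: int) -> float:
    # likes as a percentage of all ratings
    return likes * 100 / (likes + dislikes)

def video_power_index(likes: int, dislikes: int, views: int, days: int) -> float:
    # VPI = like ratio x view ratio / 100
    return like_ratio(likes, dislikes) * view_ratio(views, days) / 100

# hypothetical video: 10 000 views over 100 days, 900 likes, 100 dislikes
print(video_power_index(900, 100, 10_000, 100))  # -> 90.0
```

A video therefore scores a high VPI only when it is both well liked (high like ratio) and frequently viewed relative to its age (high view ratio).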

Evaluation of video quality and reliability

The quality and reliability of YouTube videos were assessed using three scoring systems: the Journal of the American Medical Association (JAMA) benchmark criteria, the Global Quality Score (GQS) and the Carpal Tunnel Syndrome-Specific Score (CTS-ss). The JAMA criteria enable a non-specific assessment of content reliability and include four criteria (table 1).22 Each criterion is assigned 1 point, for a maximum total of 4 points. A score of 0 indicates low video reliability and accuracy, whereas a score of 4 indicates high video reliability and accuracy. The GQS1 10 23 consists of five grades and provides a non-specific assessment of health-related website quality (table 2). The total GQS ranges from 1 to 5, with a higher score indicating better educational quality. To better evaluate the quality and accuracy of YouTube videos concerning CTS, we employed the new CTS-ss, which consists of 20 items. We generated this scoring system based on recent review articles24–26 and guidelines published by the American Academy of Orthopaedic Surgeons,27 an approach considered reasonable in previous studies.9 10 The CTS-ss evaluates information on (1) patient symptoms and population, (2) carpal tunnel anatomy, (3) CTS diagnosis and evaluation, (4) treatment options, and (5) postoperative care and course (box 1). One point was given for each of the 20 items, for a maximum total of 20 points. Higher scores indicate higher CTS-specific educational value.
Table 1

Journal of the American Medical Association benchmark criteria22

Criterion   | Description
Authorship  | Author and contributor credentials and their affiliations should be provided.
Attribution | All copyright information should be clearly listed, and references and sources for content should be stated.
Currency    | The initial date of posted content and dates of subsequent updates to content should be provided.
Disclosure  | Conflicts of interest, funding, sponsorship, advertising, support and video ownership should be fully disclosed.
Table 2

Global Quality Score criteria1 10 23

Grade | Description of quality
1 | Poor quality, information missing, technique misleading; unlikely to be useful for patient education
2 | Generally sparse quality, some information provided but majority lacking, technique poor; limited use for patients
3 | Moderate quality, important information provided but some lacking, technique mostly adequate; somewhat useful for patients
4 | Good quality, majority of information provided but some information lacking, technique adequate; useful for patients because most important topics are covered
5 | Excellent quality, full information provided, technique adequate; highly useful for patients
Box 1: Carpal Tunnel Syndrome-Specific Score checklist (1 point per item)

1. Describes symptoms (eg, nocturnal paraesthesia, loss of sensation, thenar muscle atrophy).
2. Describes patient population, especially high prevalence in older women.
3. Describes carpal tunnel anatomy and/or function.
4. Mentions that CTS is caused by nerve compression.
5. Describes risk factors (eg, diabetes, hypothyroidism, pregnancy and repetitive use).
6. Mentions physical examination and findings (eg, Tinel’s sign and Phalen’s manoeuvre).
7. Discusses electrophysiological tests.
8. Discusses additional diagnostic tests (eg, ultrasound and MRI).
9. Mentions patient-centred measures (eg, the Boston Carpal Tunnel Syndrome Questionnaire).
10. Discusses differential diagnosis (eg, cervical radiculopathy).
11. Describes non-surgical treatment, especially changes in habits.
12. Mentions that laser therapy is one of the non-surgical options.
13. Mentions pharmacotherapy (eg, local corticosteroid injection, NSAIDs).
14. Mentions musculoskeletal manipulation and/or splinting.
15. Describes surgical treatment as the most effective treatment.
16. Mentions open carpal tunnel release.
17. Mentions endoscopic carpal tunnel release.
18. Describes complications and outcomes (eg, CRPS, scar tenderness, reoperation).
19. Mentions need for postoperative physical therapy.
20. Outlines return-to-function timeline.

Intraobserver reliability and interobserver agreement assessment

All three scoring systems (JAMA, GQS and CTS-ss) were independently assessed twice, 30 days apart, by two raters: one orthopaedic surgeon (DK) and one family medicine doctor (YK). Intraobserver and interobserver agreement were determined using intraclass correlation coefficients (ICCs). ICCs for absolute agreement with a single measurement were used to assess intraobserver reliability, using two-way mixed-effects analysis-of-variance models. ICCs for absolute agreement with a single rater were used to assess interobserver agreement, using two-way random-effects analysis-of-variance models. A guideline28 for interpreting ICC values was adopted: excellent (>0.90), good (0.75–0.90), moderate (0.50–0.75) and poor (<0.50). In cases of disagreement, all authors re-evaluated the video in question until consensus was reached.
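The two-way random-effects, absolute-agreement, single-rater ICC used for interobserver agreement (often written ICC(2,1)) can be sketched from the standard ANOVA mean squares. This is a generic illustration with made-up ratings, not the study's scoring data.

```python
import numpy as np

def icc2_1(ratings) -> float:
    # ICC(2,1): two-way random effects, absolute agreement, single rater.
    # ratings: n targets (rows) x k raters (columns)
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    SSR = k * ((Y.mean(axis=1) - grand) ** 2).sum()  # between-targets sum of squares
    SSC = n * ((Y.mean(axis=0) - grand) ** 2).sum()  # between-raters sum of squares
    SST = ((Y - grand) ** 2).sum()
    SSE = SST - SSR - SSC                            # residual sum of squares
    MSR = SSR / (n - 1)
    MSC = SSC / (k - 1)
    MSE = SSE / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

# two raters scoring three hypothetical videos; rater 2 is systematically 1 point higher
print(icc2_1([[1, 2], [3, 4], [5, 6]]))  # -> 0.888..., offset penalised by absolute agreement
```

Because absolute agreement (rather than consistency) is required, a constant offset between raters lowers the ICC even though the rank ordering is identical.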

Statistical analysis

Continuous variables are presented as mean±SD. Differences in the JAMA score, GQS, CTS-ss and VPI according to (1) video upload source and (2) category of video contents were evaluated by one-way analysis of variance tests (for normally distributed data) and Kruskal-Wallis tests (for non-normally distributed data) followed by post hoc tests using the Bonferroni method. A Spearman correlation analysis was used to assess the correlation between scores and between video characteristics and scores. A multiple linear regression analysis was performed to identify video characteristics affecting the JAMA score, GQS, CTS-ss and VPI. All reported p values were two-sided, and those <0.05 were considered statistically significant.
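The tie-aware Spearman correlation used in the analysis amounts to ranking both variables (with average ranks for ties) and taking the Pearson correlation of the ranks. The pure-Python function below is an illustrative implementation, not the study's analysis code.

```python
from statistics import mean

def _ranks(values):
    # assign 1-based ranks, averaging ranks across tied values
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tie block
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y) -> float:
    # Pearson correlation of the rank vectors
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# hypothetical per-video scores: quality scores track each other perfectly here
print(spearman_rho([2, 3, 1, 4], [3, 4, 1, 5]))  # -> 1.0
```

Rank-based correlation is appropriate here because the JAMA score, GQS and CTS-ss are ordinal and the popularity metrics are heavily skewed.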

Results

Video characteristics and quality scores

The mean JAMA score, GQS and CTS-ss were 2.13, 2.69 and 5.0, respectively, indicating low reliability and educational quality (table 3). The raw JAMA scores and CTS-ss values are shown in online supplemental table 2. Non-physician video sources accounted for the largest share (29.09%) and commercial sources for the smallest (5.45%) (figure 2). Disease-specific information accounted for the largest share of content (32.73%) and patient experience for the smallest (3.64%) (figure 3). The video title, YouTube channel name, JAMA score, GQS, CTS-ss and VPI of the top 55 videos are listed in order of number of views in figure 4.
Table 3

Characteristics of 55 YouTube videos about carpal tunnel syndrome

Variable | Value
Number of subscribers | 742 791.7±1 183 968
Video running time (s) | 400.71±271.91
Number of views | 1 559 722±7 629 661
Number of days since upload | 2450.27±1250.96
Number of comments | 316.75±332.4
Number of likes | 5184.51±4804.72
Number of dislikes | 242.8±421.93
View ratio | 478.77±1506.85
Like ratio | 92.81±7.39
VPI | 382.9±910.34
JAMA score | 2.13±0.94
GQS | 2.69±1.17
CTS-ss | 5.0±3.29

Data are presented as mean±SD.

Formulas: view ratio, number of views/days since upload; like ratio, number of likes×100/[number of likes+number of dislikes]; VPI, like ratio×view ratio/100.

CTS-ss, Carpal Tunnel Syndrome-Specific Score; GQS, Global Quality Score; JAMA, Journal of the American Medical Association; VPI, Video Power Index.

Figure 2

Categorical distribution of video source.

Figure 3

Categorical distribution of video content.

Figure 4

Data-bar visualisation of the top 55 carpal tunnel syndrome and release videos with the highest number of views. CTS-ss, Carpal Tunnel Syndrome-Specific Score; GQS, Global Quality Score; JAMA, Journal of the American Medical Association; VPI, Video Power Index.


Differences in video reliability and quality by source and content

The JAMA score (p<0.0001) and GQS (p=0.0004) differed significantly among the seven groups of video sources, with videos from academic and physician sources having the highest mean JAMA scores and GQS (table 4). The JAMA score (p=0.0077) and GQS (p=0.0018) differed significantly among the six groups of video content, with videos about surgical technique and disease-specific information having the highest mean JAMA scores and GQS. However, the CTS-ss and VPI did not differ significantly between the groups based on video sources and contents.
Table 4

Mean quality and reliability scores per video source and video content variable

Grouping variable | JAMA score | GQS | CTS-ss | VPI
Video source
 Academic | 3.38±0.74 | 3.63±1.06 | 6.12±5.0 | 1077.92±2324.16
 Physician | 2.7±0.82 | 3.5±1.18 | 6.4±3.24 | 156.50±79.12
 Non-physician | 2.0±0.52 | 2.43±0.73 | 4.13±2.28 | 314.65±204.90
 Trainer | 1.25±0.5 | 1.5±0.58 | 3.0±2.31 | 243.20±157.61
 Medical | 1.7±0.82 | 2.7±1.25 | 5.6±3.41 | 371.63±370.09
 Patient | 1.25±0.5 | 1.25±0.5 | 2.25±0.5 | 172.21±127.05
 Commercial | 1.33±0.58 | 2.33±0.58 | 6.33±3.06 | 152.93±122.48
 P value* | <0.0001 | 0.0004 | 0.1306 | 0.4234
 Significant differences in post hoc analysis† | Academic versus non-physician, trainer, medical, patient, commercial; physician versus trainer, medical, patient, commercial | Academic versus trainer, patient; physician versus trainer, patient | — | —
Video content
 Exercise training | 1.73±0.79 | 1.91±0.83 | 3.09±1.97 | 344.15±266.65
 Disease-specific | 2.33±0.84 | 3.17±1.04 | 6.22±3.54 | 227.41±161.24
 Patient experience | 1.5±0.71 | 1.5±0.71 | 2.5±0.71 | 133.82±109.52
 Surgical technique | 2.83±1.11 | 3.42±1.16 | 5.92±3.65 | 724.92±1917.21
 Non-surgical | 1.63±0.52 | 2.13±1.13 | 4.13±2.64 | 396.44±367.10
 Advertisement | 1.5±0.58 | 2.25±0.5 | 5.0±3.65 | 260.57±237.37
 P value‡ | 0.0077 | 0.0018 | 0.0897 | 0.3493
 Significant differences in post hoc analysis† | Surgical technique versus exercise training, non-surgical | Disease-specific and surgical technique versus exercise training | — | —

Data are presented as mean±SD.

*For the video source group, significant differences were seen in JAMA score and GQS.

†Post hoc tests were performed using Bonferroni’s method.

‡For the video content group, significant differences were seen in JAMA score and GQS.

CTS-ss, Carpal Tunnel Syndrome-Specific Score; GQS, Global Quality Score; JAMA, Journal of the American Medical Association; VPI, Video Power Index.


Factors affecting video quality and popularity

The JAMA score, GQS and CTS-ss were significantly correlated with each other (JAMA score vs GQS, p<0.001; JAMA score vs CTS-ss, p=0.001; GQS vs CTS-ss, p<0.001). However, the VPI was not significantly correlated with any of the three scores. Multiple linear regression analysis showed that a higher JAMA score was associated with a higher like ratio and with an academic or physician upload source compared with a patient upload source (table 5). A higher GQS was associated with a longer video running time, a greater number of comments, and an academic, physician, non-physician, medical or commercial upload source compared with a patient upload source. A higher CTS-ss was associated with academic, physician, medical and commercial upload sources compared with patient upload sources. However, a higher VPI was not associated with higher video quality or reliability scores.
Table 5

Multiple linear regression analysis of correlations between video characteristics and the VPI, JAMA score, GQS and CTS-ss

Variable | Unstandardised beta (B) | 95% CI | Standardised β | P value
VPI (R²=0.997)
 Days since upload | −0.039 | −0.058 to −0.02 | −0.053 | <0.001
 View ratio | 0.595 | 0.576 to 0.614 | 0.985 | <0.001
 Number of likes | 14.118 | 6.808 to 21.428 | 0.075 | <0.001
JAMA score (R²=0.626)
 Like ratio | 0.054 | 0.001 to 0.107 | 0.424 | 0.045
 Video source
  Academic | 2.126 | 1.164 to 3.088 | 0.801 | <0.001
  Physician | 1.187 | 0.239 to 2.136 | 0.49 | 0.015
GQS (R²=0.561)
 Video running time | 0.001 | 0 to 0.002 | 0.252 | 0.044
 Number of comments | 0.002 | 0 to 0.003 | 0.461 | 0.029
 Video source
  Academic | 3.025 | 1.735 to 4.315 | 0.921 | <0.001
  Physician | 2.465 | 1.193 to 3.736 | 0.821 | <0.001
  Non-physician | 1.596 | 0.337 to 2.856 | 0.626 | 0.014
  Medical | 1.878 | 0.661 to 3.094 | 0.625 | 0.003
  Commercial | 1.874 | 0.32 to 3.429 | 0.368 | 0.019
CTS-ss (R²=0.356)
 Video source
  Academic | 6.225 | 1.825 to 10.624 | 0.673 | 0.007
  Physician | 5.174 | 0.838 to 9.51 | 0.612 | 0.021
  Medical | 4.978 | 0.828 to 9.128 | 0.589 | 0.02
  Commercial | 6.430 | 1.13 to 11.731 | 0.448 | 0.019

CTS-ss, Carpal Tunnel Syndrome-Specific Score; GQS, Global Quality Score; JAMA, Journal of the American Medical Association; VPI, Video Power Index.

The intraobserver reliability of the two raters was excellent for the JAMA score, GQS and CTS-ss. The interobserver agreement between raters was good for the JAMA score (ICC 0.881, 95% CI 0.804 to 0.929), good for the GQS (ICC 0.881, 95% CI 0.804 to 0.929) and excellent for the CTS-ss (ICC 0.941, 95% CI 0.898 to 0.966).

Discussion

This study demonstrated that the reliability and quality of YouTube videos concerning CTS were low, consistent with previous YouTube video quality-evaluation studies.1 10 13 20 21 29–31 Mert and Bozgeyik20 evaluated the quality of CTS videos on YouTube, reported low video reliability and quality, and found no significant relationship between video characteristics and the reliability and quality scores. Radonjic et al21 also evaluated CTS videos on YouTube, showed low reliability and quality, and found that videos uploaded by physicians had significantly higher reliability and quality scores than those uploaded by non-physicians. Goyal et al18 reported that YouTube videos on CTS have low information quality and determined that reinforcement of misconceptions is prevalent in them. Although the overall reliability and educational quality of YouTube videos were low, videos from academic and physician uploaders, and videos about surgical techniques and disease-specific information, scored significantly higher than other sources and contents. This is because the main purpose of these videos is to educate doctors, medical students and patients. In contrast, the CTS-ss did not differ significantly among video sources and contents, because YouTube videos tend to focus on specific topics, such as symptoms, surgical technique or rehabilitation after surgical treatment, and deliver the content within a short running time. Additionally, some channels, such as ‘Bob & Brad’, posted a four-part series on CTS and carpal tunnel release. Casual YouTube viewers cannot obtain sufficient content on CTS and its release from only one or two videos, but an entire series can cover most of the content.
YouTube uploaders usually post short videos of less than 10 min to maximise the number of views and user interest; thus, they split content across several videos. Most of the videos had low reliability and educational quality, but some were practical and educationally useful. The ‘Carpal Tunnel Syndrome - Everything You Need To Know - Dr. Nabil Ebraheim’ video on the nabil ebraheim channel explains the overall symptoms, anatomy and risk factors of CTS. The ‘Surgery Video: Carpal Tunnel - MedStar Union Memorial’ video on the MedStar Health channel shows the surgical procedure for endoscopic carpal tunnel release in detail. The ‘How to Determine If You Really Have Carpal Tunnel Syndrome - Dr Mandell, DC’ video on the motivationaldoc channel shows the physical examination required for CTS diagnosis. In this study, video popularity showed no significant correlation with reliability or quality: the popular videos that casual YouTube users and patients frequently watch do not have good quality and reliability. Interestingly, YouTube videos from expert groups expected to have high reliability and quality, such as the American Academy of Orthopaedic Surgeons or the Federation of European Societies for Surgery of the Hand, were not among the top 55 videos. A manual search identified only about 1600 views for the carpal tunnel release video uploaded to the American Academy of Orthopaedic Surgeons YouTube channel (https://www.youtube.com/watch?v=eemuH5UYElo). Additionally, the Federation of European Societies for Surgery of the Hand and the British Society for Surgery of the Hand channels have no CTS-related videos and only 154 and 575 subscribers, respectively. Expert groups should therefore promote their YouTube videos and channels and provide accurate medical information by uploading high-quality videos and exposing them to casual YouTube users and patients.
In a previous study on the meniscus,10 video dislikes were a predictor of YouTube video reliability, but this was not the case in this study; the independent predictor of the JAMA score here was the like ratio. Furthermore, the independent predictors of the GQS were video running time and number of comments: videos with a longer running time and more comments were independently and significantly associated with a higher GQS. The longer the video, the more information it can contain, and therefore the higher its educational quality; likewise, a greater number of comments provides more useful information for users who watch the video. Regarding the CTS-ss, academic, physician, medical and commercial upload sources were associated with higher scores than patient upload sources; however, unlike the JAMA score and GQS, the CTS-ss showed no significant association with any video characteristic other than source. Our study has several limitations. First, we searched the top 50 videos for “carpal tunnel syndrome” and “carpal tunnel release” on YouTube in order of popularity. This search strategy may have missed videos with few views but potentially high quality; nevertheless, it reflects how casual YouTube users actually obtain information. Second, YouTube video metrics such as the number of likes and views are constantly updated; therefore, these data are accurate only as of the search date. Third, the assessment scoring systems we used (the JAMA score, GQS and CTS-ss) are subjective and unvalidated. Because the JAMA benchmark criteria were developed to assess medical information on internet websites rather than in videos, the criteria may not fit YouTube videos.
The CTS-ss covers many aspects of CTS, but most YouTube videos run for about 10 min or less; thus, it is difficult to cover the full CTS-ss checklist in a short video. Because some criteria of the JAMA benchmark criteria and CTS-ss were unmet in most videos, the total score may be driven mainly by a few criteria; in effect, the criteria are not equally weighted. Nevertheless, we had no choice but to use these scoring systems, given the lack of a validated system for evaluating the quality and reliability of medical information in YouTube videos. To mitigate these shortcomings, excellent interobserver and intraobserver reliability were confirmed using ICCs. In addition, because the GQS may be highly subjective, two independent authors performed each evaluation twice. Fourth, one video, entitled ‘Podcast: See a Live Surgery for Carpal Tunnel Syndrome’, accounted for a dominant share of views (66.5%), inflating the average views and VPI values; we tried to buffer this dominance by analysing 55 videos.

Conclusions

This study demonstrated that YouTube videos on CTS have low reliability and quality. Video quality was significantly associated with content and upload source, whereas video popularity was not correlated with reliability or quality, suggesting that good content quality does not guarantee popularity. The impact of videos on patient care should not be underestimated. To ensure the spread of accurate information, expert groups should publish YouTube videos and strive to provide high-quality video materials that can assist with patient diagnosis and treatment.
References (28 in total)

1.  Evaluating the source and content of orthopaedic information on the Internet. The case of carpal tunnel syndrome.

Authors:  P K Beredjiklian; D J Bozentka; D R Steinberg; J Bernstein
Journal:  J Bone Joint Surg Am       Date:  2000-11       Impact factor: 5.284

2.  YouTube as an information source for femoroacetabular impingement: a systematic review of video content.

Authors:  Matthew G MacLeod; Daniel J Hoppe; Nicole Simunovic; Mohit Bhandari; Marc J Philippon; Olufemi R Ayeni
Journal:  Arthroscopy       Date:  2014-08-20       Impact factor: 4.772

3.  Educational quality of YouTube videos on knee arthrocentesis.

Authors:  Jonas Fischer; Jeroen Geurts; Victor Valderrabano; Thomas Hügle
Journal:  J Clin Rheumatol       Date:  2013-10       Impact factor: 3.517

4.  Quality and readability of online information on carpal tunnel syndrome.

Authors:  Joost T P Kortlever; Lindy Derkzen; Iris I M Kleiss; Lee M Reichel; Gregg A Vagner
Journal:  J Hand Surg Eur Vol       Date:  2019-06-25

5.  Comparing Diagnostic and Treatment Recommendations of Carpal Tunnel Syndrome Available on the Internet With AAOS Clinical Practice Guidelines.

Authors:  Jerrod Steimle; Speros Gabriel; Ryan Tarr; Brandon Kohrs; Patrick Johnston; David Martineau
Journal:  Hand (N Y)       Date:  2019-01-17

6.  YouTube provides poor information regarding anterior cruciate ligament injury and reconstruction.

Authors:  J T Cassidy; E Fitzgerald; E S Cassidy; M Cleary; D P Byrne; B M Devitt; J F Baker
Journal:  Knee Surg Sports Traumatol Arthrosc       Date:  2017-03-17       Impact factor: 4.342

7.  Evaluating the Accuracy and Quality of the Information in Kyphosis Videos Shared on YouTube.

Authors:  Mehmet Nuri Erdem; Sinan Karaca
Journal:  Spine (Phila Pa 1976)       Date:  2018-11-15       Impact factor: 3.468

8.  Assessment of "YouTube" Content for Distal Radius Fracture Immobilization.

Authors:  Abdullah Addar; Yousef Marwan; Nizar Algarni; Gregory Berry
Journal:  J Surg Educ       Date:  2017-03-27       Impact factor: 2.891

9.  Video informed consent improves knee arthroscopy patient comprehension.

Authors:  Michael J Rossi; Dan Guttmann; Megan J MacLennan; James H Lubowitz
Journal:  Arthroscopy       Date:  2005-06       Impact factor: 4.772

10.  Quality of Online Video Resources Concerning Patient Education for the Meniscus: A YouTube-Based Quality-Control Study.

Authors:  Kyle N Kunze; Laura M Krivicich; Nikhil N Verma; Jorge Chahla
Journal:  Arthroscopy       Date:  2020-01       Impact factor: 4.772

