
[YouTube as an informational source for brachial plexus blocks: evaluation of content and educational value].

Onur Selvi1, Serkan Tulgar2, Ozgur Senturk2, Deniz I Topcu3, Zeliha Ozer2.   

Abstract

BACKGROUND AND OBJECTIVES: YouTube, the most popular video-sharing website, contains a significant number of medical videos including brachial plexus nerve blocks. Despite the widespread use of this platform as a medical information source, there is no regulation for the quality or content of the videos. The goals of this study are to evaluate the content of material on YouTube relevant to performance of brachial plexus nerve blocks and its quality as a visual digital information source.
METHODS: The YouTube search was performed using keywords associated with brachial plexus nerve blocks, and 86 of the 374 videos found were included in the final watch list. The assessors scored the videos independently according to two questionnaires. Questionnaire-1 (Q1) was prepared according to the ASRA guidelines and Miller's Anesthesia as reference texts, and Questionnaire-2 (Q2) was formulated using a modification of the criteria in the Guidelines for the Preparation and Evaluation of Video Media.
RESULTS: 72 ultrasound-guided and 14 nerve-stimulator guided block videos were evaluated. In Q1, for ultrasound-guided videos, the lowest score was for Q1-5 (1.38), regarding complications, and the highest was for Q1-13 (3.30), regarding the sono-anatomic image. In videos with nerve stimulator, the lowest and highest scores were for Q1-7 (1.64), regarding the equipment, and Q1-12 (3.60), regarding the explanation of muscle twitches, respectively. In Q2, 65.3% of ultrasound-guided and 42.8% of nerve-stimulator guided blocks had worse than satisfactory scores.
CONCLUSIONS: The majority of the videos examined for this study lack the comprehensive approach necessary to safely guide someone seeking information about brachial plexus nerve blocks.
Copyright © 2018. Published by Elsevier Editora Ltda.

Keywords:  Anestesia; Anesthesia; Bloqueios do plexo braquial; Brachial plexus blocks; YouTube

Year:  2019        PMID: 30635118      PMCID: PMC9391886          DOI: 10.1016/j.bjan.2018.11.004

Source DB:  PubMed          Journal:  Braz J Anesthesiol        ISSN: 0104-0014


Introduction

YouTube (www.youtube.com; YouTube LLC, San Bruno, USA) is one of the most popular digital information sources currently available, with more than 4 billion videos watched every day. The healthcare social media listings of the Mayo Clinic name more than 700 health-related organizations in the United States of America alone that have a presence on YouTube. Some of these videos were prepared for healthcare providers as educational visual guides to new interventions. However, there are no regulations or standards governing the educational quality of videos available on YouTube, and misleading information has been shared there that could pose risks to healthcare professionals or their patients. On the other hand, e-learning methods such as video recordings have become an important part of medical education.3, 4, 5, 6 Many studies have shown that learning through visual sources has advantages over conventional didactic training methods, both for medical students and in other areas of medical training, including regional anesthesia. In regional anesthesia education and training especially, clear visual guidance is valuable for clarifying the complex interactions of anatomy, manual skills, physiology, and clinical judgment. Several regional anesthesia institutions regularly publish comprehensive educational multimedia materials for such advanced interventions; however, such proprietary materials may not achieve the dissemination impact of YouTube. Increasingly, YouTube is being recognized as a potentially useful source of healthcare education and training material, where complex ideas can be relayed in a basic visual format. At the same time, it is currently possible for accurate information to be displayed on YouTube in a way that is disorganized, disjointed or even misleading.
To date, YouTube videos of neuraxial block techniques and lumbar punctures have been assessed for patient safety, consistency with a scientific approach, and quality.9, 10 The aim of this study was to evaluate YouTube videos relevant to the performance and preparation of brachial plexus nerve blocks, using ASRA guidelines and Miller's Anesthesia as reference texts.11, 12

Methods

Video selection

Brachial plexus nerve block (BPNB) procedures were selected by consensus among the authors as the target group of YouTube videos to be evaluated. The YouTube search for videos of axillary, infraclavicular, interscalene and supraclavicular nerve blocks, which are extensively used for shoulder, arm and hand surgeries, was completed on 06.10.2017 by the main investigator, an anesthesiologist with 8 years' experience in anesthesia. The selected keywords were “interscalene block ultrasound”, “interscalene block nerve stimulator”, “supraclavicular block ultrasound”, “supraclavicular block nerve stimulator”, “infraclavicular block ultrasound”, “infraclavicular block nerve stimulator”, “axillary block ultrasound” and “axillary block nerve stimulator”.

Establishment of assessor team

One consultant anesthesiologist and three experienced anesthesiologists, each with more than 7 years' experience in anesthesia, were selected as the assessor team. All of them can safely perform the blocks mentioned above in their daily routine and teach them as part of a trainee education program; two hold ultrasound-guided regional anesthesia training certificates from the Turkish Regional Anesthesia Society. The main investigator, as team leader, organized a workshop in which randomly selected lower extremity regional anesthesia videos from YouTube were scored by the assessors on a form specifically prepared for data entry. Information gathered during the workshop was used to resolve any ambiguities about data recording, the scoring system and the exclusion criteria, clarifying the study objectives. Any disputed points on these elements were settled by consensus.

Watch list

The main investigator prescreened the first 100 recent and popular videos for each keyword to create a “watch list”; further pages were not included in the search because of the frequency of obsolete, irrelevant or repeated material. The search was limited to the first 100 videos because the aim was not to evaluate all videos, but only those with the highest likelihood of being viewed. Because the YouTube search algorithm automatically re-ranks videos as viewer counts change, the search results were listed in a single session before any such changes could occur. Uniform Resource Locator (URL) addresses of the videos were copied into an Excel sheet to form the watch list. The final list was reviewed by the four assessors and consensus was reached on it according to the exclusion criteria: videos in languages other than English, with unrelated content, or shorter than 1 minute were excluded, and videos longer than 15 minutes were excluded because their length could jeopardize the accuracy of the assessment.
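As a rough illustration, the exclusion step described above amounts to a simple filter over the candidate list. The field names below (`language`, `on_topic`, `duration_min`) are hypothetical and not from the study's actual data-entry form:

```python
# Hypothetical sketch of the watch-list exclusion step; field names are
# illustrative, not taken from the study's data-entry form.

def passes_exclusion(video: dict) -> bool:
    """Keep only English, on-topic videos between 1 and 15 minutes long."""
    return (
        video["language"] == "en"
        and video["on_topic"]
        and 1 <= video["duration_min"] <= 15
    )

candidates = [
    {"url": "https://youtu.be/a", "language": "en", "on_topic": True,  "duration_min": 6.5},
    {"url": "https://youtu.be/b", "language": "tr", "on_topic": True,  "duration_min": 4.0},   # non-English
    {"url": "https://youtu.be/c", "language": "en", "on_topic": False, "duration_min": 3.0},   # unrelated content
    {"url": "https://youtu.be/d", "language": "en", "on_topic": True,  "duration_min": 22.0},  # too long
]
watch_list = [v["url"] for v in candidates if passes_exclusion(v)]
print(watch_list)  # only the first video survives the filter
```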

Questionnaire 1

Questionnaire 1 (Q1 – Table 1) was prepared using the Delphi method, based on the standard procedure definitions of the American Society of Regional Anesthesia and Miller's Anesthesia textbook.11, 12 It had 18 questions about the technique itself, covering safety, hygiene, anatomy, landmarks, complications, local anesthetic use, and equipment. Each assessor gave every question a score from 1 to 5 (1 being very bad and 5 being excellent). Question 10 determined whether a nerve stimulator was used, and questions 11 and 12 were answered only for videos guided solely by nerve stimulator. The last six questions were answered only for ultrasound-guided blocks and related to the ultrasound image, needle manipulation, and sonographic image interpretation. The four assessors completed Q1 for each video individually.
Table 1

Questionnaire 1.

Q1. In which kind of operation is this block applicable? Was this information clearly explained?
Q2. Was there a clear explanation of the targeted skin dermatomes innervated by the nerve?
Q3. Were anatomical landmarks clearly explained or marked?
Q4. Were important vessels and nerve structures in close relation to the targeted nerve clearly explained?
Q5. Were possible complications related to this block technique explained?
Q6. Was the information on sterilization procedures clearly explained or emphasized?
Q7. Was the information about nerve stimulator device and needle choice clearly explained?
Q8. Was the information for skin local anesthetic infiltration (volume, name of medication) clearly explained?
Q9. Was the information about the local anesthetic substance clearly explained?
Q10. Was a nerve stimulator used in this block?
Q11. If YES, was the safe threshold level for electrical impulses clearly explained?
Q12. If YES, were the muscle twitches regarding the stimulated nerve clearly explained?
Q13. Were the sono-anatomic image recording and the anatomical structures in the recording clear and easy to perceive?
Q14. Was the ultrasound image of the needle visible and easy to follow?
Q15. Were the instructions for depth, alignment and direction movements of the needle clearly explained?
Q16. Was the technical information for probe selection and frequency regarding the ultrasound device explained?
Q17. Was the information about in-plane or out-of-plane technique presented in the video?
Q18. Was the information about the local anesthetic spread explained?

Questionnaire 2

Questionnaire 2 (Q2 – Table 2), which included 14 items on the preparation and generic video quality of each video, was also completed individually. In this second part, the investigators evaluated the videos according to the Guidelines for the Preparation and Evaluation of Video Career Media of the American National Career Development Association (NCDA). Each item was marked from 0 to 5 (0 = does not apply, 1 = unsatisfactory, 2 = poor, 3 = satisfactory, 4 = good, 5 = outstanding). The final scores of each video were averaged and grouped as follows: 0–13, unsatisfactory; 14–27, poor; 28–41, satisfactory; 42–55, good; 56–70, outstanding.
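For illustration only, the banding of a summed Q2 score (14 items, 0 to 5 each, giving a 0 to 70 total) can be sketched as below; the averaging across assessors is assumed to happen before this step:

```python
from bisect import bisect_right

# Illustrative sketch of the Q2 banding described above, not the authors'
# actual scoring code. Band boundaries follow the Methods section.
BANDS = ["unsatisfactory", "poor", "satisfactory", "good", "outstanding"]
CUTS = [14, 28, 42, 56]  # lower bounds of every band above "unsatisfactory"

def q2_band(item_scores):
    """Map 14 item scores (each 0-5) to the study's quality band."""
    assert len(item_scores) == 14 and all(0 <= s <= 5 for s in item_scores)
    return BANDS[bisect_right(CUTS, sum(item_scores))]

print(q2_band([3] * 14))  # total 42 -> "good"
print(q2_band([1] * 14))  # total 14 -> "poor"
```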
Table 2

Questionnaire 2.

Q1. Was the aim of the video clearly stated, and was it explained in the first quarter of the video?
Q2. Did the title or name of the video match the aim of the video?
Q3. Were the design and the content of the video suitable for a targeted educational aim?
Q4. Were the skills and the technique of the procedure explained using a standard, comparable and “step by step” method?
Q5. Was the information given in the video useful for viewers to develop/enhance their skill base?
Q6. Was the content of the video appropriate for the health and safety of both the patient and the practitioner?
Q7. Was the quality of the picture, regarding colors and clarity, acceptable?
Q8. Was the quality of the video sound acceptable? (No sound should be scored as zero)
Q9. Was the length of the video in balance with its content?
Q10. Was the information on the date of production or release, the producers and the references clearly explained?
Q11. Were objectives, learning tasks and terminology clearly stated in the video, enabling viewers to address those tasks?
Q12. Did the video have stop-and-discuss points, or additional aids such as scripts and/or summarized information on the procedure?
Q13. Was any information given on a way to evaluate the effectiveness and reproducibility of the video?
Q14. Did the content of the video stimulate viewers to make the transition from passive viewer to active practitioner in the application of the technique?

Final evaluation meeting for disputed videos

A consensus score was reached in a focus-group meeting for those videos whose Fleiss' Kappa score in Q1 was below 0.20. The consultant anesthesiologist acted as moderator during the meeting. When ambiguities over a video could not be reconciled, voting was used to reach a final score.

Statistical analyses

R 3.4.3 (R Core Team, 2017) was used for all data cleaning, analysis and visualization, and for the Inter-Rater Reliability (IRR) calculations.14, 15 Agreement between assessors for each video was graded according to the Fleiss' Kappa score: no agreement (<0), insignificant agreement (0.0–0.20), moderate agreement (0.21–0.40), most-part agreement (0.41–0.60), significant agreement (0.61–0.80), and excellent agreement (0.81–1.00).
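As an illustrative sketch (not the authors' actual R code), Fleiss' Kappa for a subjects-by-categories matrix of rating counts can be computed as follows; the example ratings are hypothetical:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a list of per-subject category-count rows.

    counts[i][j] = number of raters assigning subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(counts)        # subjects (videos)
    n = sum(counts[0])     # raters per subject
    k = len(counts[0])     # rating categories
    # proportion of all assignments falling into each category
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    # observed agreement, averaged over subjects
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N
    P_e = sum(pj * pj for pj in p)  # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 4 assessors place 5 videos into 5 score categories
ratings = [
    [0, 0, 0, 0, 4],  # unanimous
    [0, 2, 2, 0, 0],  # split between two categories
    [0, 0, 4, 0, 0],
    [3, 1, 0, 0, 0],
    [0, 4, 0, 0, 0],
]
print(round(fleiss_kappa(ratings), 2))  # 0.68: "significant agreement"
```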

Results

72 ultrasound-guided and 14 nerve-stimulator guided block videos, a total of 86 of the 374 videos found on YouTube, were evaluated. A total of 288 videos were excluded for the following reasons: 8 non-English, 228 off-topic, 36 repeated, and 16 too long or too short. The exclusion criteria and numbers for the whole evaluation process can be seen in Table 3 and Fig. 1.
Table 3

Video selection.

Block                                      Total  Non-English  Off-topic  Too long/short  Repeated  Selected
Interscalene B. (nerve stimulator guided)     31            1         25               1         1         3
Interscalene B. (ultrasound guided)           41            2         13               4         1        21
Supraclavicular B. (nerve stimulator guided)  34            1         28               0         1         4
Supraclavicular B. (ultrasound guided)        65            3         37               2         3        20
Infraclavicular B. (nerve stimulator guided)  37            1         27               1         7         1
Infraclavicular B. (ultrasound guided)        60            0         29               1        15        15
Axillary B. (nerve stimulator guided)         20            0         13               1         0         6
Axillary B. (ultrasound guided)               86            0         56               6         8        16

B., block.

Figure 1

Flow chart of the study.


Results of Q1

The final Q1 scores for the 14 nerve-stimulator guided videos and the 72 ultrasound-guided videos are presented in Figure 2 and Figure 3, respectively. As the heat-map graphics show, the first videos in each block technique's watch list scored higher than subsequent videos, and the last videos in the watch lists had the lowest scores.
Figure 2

Distribution of scores for the brachial plexus nerve block videos performed with nerve stimulator.

Figure 3

Distribution of scores for the ultrasound guided brachial plexus nerve block videos.

Averaging the scores for each question in Q1: in the ultrasound-guided videos the lowest mean score was 1.38 (median = 1; 25th percentile = 1; 75th percentile = 1) for Q1-05, on possible complications, and the highest was 3.30 (median = 3; 25th percentile = 2; 75th percentile = 5) for Q1-13, on sono-anatomy (Table 4). For the videos with nerve stimulator, the lowest mean score was 1.64 (median = 1; 25th percentile = 1; 75th percentile = 2) for Q1-07, on the nerve stimulator device and needle choice, and the highest was 3.60 (median = 4; 25th percentile = 3; 75th percentile = 5) for Q1-12, on muscle twitches (Table 5).
Table 4

Question based evaluation of the scores for ultrasound guided brachial plexus block videos in Q1.

USG video scores by question
Question  25th percentile  Mean  Median  75th percentile
Q13                     2  3.3        3                5
Q04                     2  3.03       3                5
Q03                     1  2.7        3                4
Q14                     1  2.7        3                4
Q18                     1  2.6        2                4
Q15                     1  2.6        2                4
Q17                     1  2.4        2                4
Q01                     1  1.9        1                2
Q16                     1  1.8        1                2
Q08                     1  1.8        1                2
Q07                     1  1.6        1                2
Q06                     1  1.6        1                1
Q02                     1  1.5        1                1
Q09                     1  1.4        1                1
Q05                     1  1.4        1                1
Table 5

Question based evaluation of the scores for nerve stimulator guided brachial plexus block videos in Q1.

NS video scores by question
Question  25th percentile  Mean  Median  75th percentile
Q12                     3  3.6        4                5
Q03                     2  3.1        3                4
Q11                     2  3.1        3                4
Q01                     1  2.5        2                4
Q08                     1  2.4        3                3
Q02                     1  2.3        2                4
Q09                     1  2.3        2             3.25
Q04                     1  2.3        2             3.25
Q06                     1  2.0        2                2
Q05                     1  1.6        1                2
Q07                     1  1.6        1                2
According to question 10 in Q1, which records nerve stimulator usage, in 11 of the 72 ultrasound-guided videos a nerve stimulator was combined with ultrasound to demonstrate muscle twitches; however, only 4 videos gave information about the lowest safe threshold for electrical impulses. A total of 26 videos were accompanied by a written-only explanation, and seven videos provided no description (written or verbal) of the procedure being undertaken. Among the ultrasound-guided videos, 36 shared sonographic recordings of successful interventions and 3 were animations. Only 17 videos stated whether the block was performed using an in-plane or out-of-plane technique. In Q1, ultrasound-guided interscalene videos scored higher than the other ultrasound-guided blocks, and ultrasound-guided axillary blocks had the lowest scores. Among the nerve-stimulator videos, axillary blocks had the highest scores, whereas supraclavicular blocks had the lowest average scores in Q1 (Table 6).
Table 6

Mean scores of the videos in Q1.

Block               Mean score, nerve stimulator videos  Mean score, ultrasound videos
Axillary B.                                        2.57                           1.84
Infraclavicular B.                                 3.11                           1.86
Interscalene B.                                    2.81                           2.51
Supraclavicular B.                                 1.82                           2.25

B., blocks.

IRR was examined for Q1: the Kappa score was between 0.81 and 1.00 (excellent agreement) for 28 videos, 0.61–0.80 (significant agreement) for 8 videos, 0.41–0.60 (most-part agreement) for 32 videos, and 0.21–0.40 (moderate agreement) for 18 videos. The highest and lowest Kappa scores in Q1 were 1.00 and 0.23, respectively. Ten videos were scored ‘unsatisfactory’ on all questions, yielding Kappa scores of 1.0 for these videos.

Results of Q2

The results of Q2, which reflect the preparation and generic video quality of the videos as educational material, can be seen in Table 7: 65.3% of ultrasound-guided videos and 42.8% of nerve-stimulator guided videos scored worse than satisfactory. Only 5 ultrasound-guided videos and 1 nerve-stimulator guided video had outstanding results.
Table 7

Preparation and generic video quality of the videos (Q2).

Evaluation      Score   USG videos (n / %)  NS videos (n / %)
Unsatisfactory  0–13          19 / 26.4           1 / 7.1
Poor            14–27         28 / 38.9           5 / 35.7
Satisfactory    28–41         11 / 15.3           5 / 35.7
Good            42–55          9 / 12.5           2 / 14.3
Outstanding     56–70          5 / 6.9            1 / 7.1

Discussion

Since the core aspect of ultrasound-guided nerve block education is visual in nature, videos can be highly beneficial for regional anesthesia trainees. How to produce reliable visual materials and incorporate them into regional anesthesia education is, in itself, an important issue for medical practice. Regional anesthesia videos with high accuracy, quality and credibility have been created by professional institutions for use in regional anesthesia training, and they require extensive planning and careful execution. As a complementary part of this visual guidance, recording regional anesthesia interventions and using the recordings as visual feedback videos can also be extremely useful for self-assessment in regional anesthesia education. Even qualified residents may use such videos as a source of information when they have lost familiarity with a specific regional anesthesia block technique but still have to perform it.7, 8, 16, 17, 18 According to the results of the current study, the highest-rated videos were prepared by professional institutions, and in these videos the sonographic image and the positioning of the ultrasound probe shared the same screen, so the viewer could observe the relationship between the alignment, tilting, pressure and rotation maneuvers and the resulting ultrasound view. These highest-rated videos included many of the aspects recommended by a new basic simulation assessment tool intended to improve the skills necessary for Needle Insertion Accuracy (NIA) in ultrasound-guided nerve blocks: approach, alignment, movement, location and targeting. All of these skills require three-dimensional thinking, interpretation of a two-dimensional screen, and hand-eye coordination.
This new visual, video-based regional anesthesia training approach emphasizes the value of showing both the hand position and the ultrasound image simultaneously on the same screen, and adopting this technique when creating visual educational materials for ultrasound-guided regional anesthesia blocks can enhance the value of the videos. Videos that are systematic in their approach need to demonstrate the target nerve location, the direction of the needle toward the target nerve, and the correct method of administering the local anesthetic to the target area. Regulations on patient safety and quality standards have raised new challenges for regional anesthesia education. To overcome these difficulties, simulation-based training methods have been described in recent years that allow trainees to acquire skills such as rotation, alignment, tilting and targeting before they practice on real patients. It has been shown that anesthesia trainees perform these sonographic skills more successfully when provided with expert-guided feedback. Although, based on our findings, we do not recommend YouTube videos as a learning tool, publishing recordings of these modern regional anesthesia training sessions together with explanatory feedback points may help viewers understand the steps taken and the errors made by trainees. Such videos, when made by reputable medical institutions, might be quite beneficial for regional anesthesia trainees and residents, who could refresh their knowledge of ultrasound-guided nerve blocks simply by watching easily available YouTube videos. In addition, institutions and their followers can meet on YouTube's social platform through such high-quality educational videos, allowing regional anesthesia practitioners to interact on a widely used platform to share and expand their experience. The safety standards of the procedures shown in the videos were also ambiguous.
In the ultrasound-guided videos the lowest average score was for question 5, which assesses the information given on possible complications of the blocks; this downgrades the reliability of the videos in terms of safety. Among these same videos, 19 had no sound recording, relying on written instructions and signs to deliver the presentation and therefore failing to give adequate information on equipment, preparation, drug doses, patient positioning, and sterilization. Such videos echo the apprenticeship-style training method, an educational approach that can prevent regional anesthesia residents from receiving standardized education. Short videos were unable to score well on the majority of Q1 and Q2 questions: only 6 of the 86 videos received outstanding quality scores, and all of these were longer than seven minutes. It can therefore be postulated that visual educational material prepared for regional anesthesia should not be too short. Although some short videos contain valuable information and tips, they are often difficult to understand for someone new to the technique. Furthermore, in similar future studies based on visual material analysis, a longer minimum time limit could be applied as an exclusion criterion, depending on the targeted outcome. The final questions of the Q2 assessment, regarding whether the videos contained the detail necessary for accurate re-enactment of the procedure in a clinical setting, received the lowest scores from the assessors, and only two videos received full points on Q2. This again confirms that YouTube videos are poorly prepared overall and could be inadequate even as refresher material for the four upper-extremity peripheral nerve block interventions studied.
According to the results of this study, there were five times more ultrasound-guided BPNB videos on YouTube than videos using conventional nerve stimulators. This reflects the growing preference for ultrasound-guided nerve blocks and indicates the demand for them as learning and sharing material on social platforms. Health institutions, universities and regional anesthesia associations could contribute more accountable and credible regional anesthesia videos to YouTube that prioritize patient safety over commercial concerns; commercial institutions and individual health providers might then be inspired by the setup of these videos to produce similar high-quality material. In this way, regional anesthesia videos on YouTube, a platform of great public impact and popularity, may come to provide more accurate and trustworthy information.

Limitations

The limitations of this study should be considered when reviewing our data. First, each assessor viewed the selected videos in the same order according to the predetermined watch list; ideally, the assessors would have reviewed the videos in random order. Although the assessors watched the videos independently, randomization would have prevented any carry-over effect of one video on the judgment of the next from distorting the results. Second, we did not examine viewer counts for each video, which might be seen as a good predictor of the quality and disseminative impact of a video. However, previous studies have found no correlation between video quality and views per month, which dissuaded us from analyzing viewer-count data.6, 9

Conclusion

The utility, scientific rigor and accountability of BPNB videos on YouTube do not match the platform's modernity and mass availability. The majority of the videos examined for this study lack the systematic approach necessary to safely guide someone seeking information about BPNBs. If professional institutions and universities published more videos with predefined competencies on social media platforms like YouTube, they could set a good example for safe and successful interventions.

Conflicts of interest

The authors declare no conflicts of interest.
References (17 in total; first 10 shown):

1.  Medical information on the Internet: Quality assessment of lumbar puncture and neuroaxial block techniques on YouTube.

Authors:  Bernhard Rössler; Daniel Lahner; Karl Schebesta; Astrid Chiari; Walter Plöchl
Journal:  Clin Neurol Neurosurg       Date:  2012-02-05       Impact factor: 1.876

2.  Do medical students watch video clips in eLearning and do these facilitate learning?

Authors:  Kalle Romanov; Anne Nevgi
Journal:  Med Teach       Date:  2007-06       Impact factor: 3.650

3.  Designing and implementing a comprehensive learner-centered regional anesthesia curriculum.

Authors:  Hugh M Smith; Sandra L Kopp; Adam K Jacob; Laurence C Torsher; James R Hebl
Journal:  Reg Anesth Pain Med       Date:  2009 Mar-Apr       Impact factor: 6.288

4.  Simulator for teaching hand-eye coordination during ultrasound-guided regional anaesthesia.

Authors:  S D Adhikary; A Hadzic; P M McQuillan
Journal:  Br J Anaesth       Date:  2013-11       Impact factor: 9.166

5.  The Altmetric Score: A New Measure for Article-Level Dissemination and Impact.

Authors:  N Seth Trueger; Brent Thoma; Cindy H Hsu; Daniel Sullivan; Lindsay Peters; Michelle Lin
Journal:  Ann Emerg Med       Date:  2015-05-23       Impact factor: 5.721

6.  The effect of metrics-based feedback on acquisition of sonographic skills relevant to performance of ultrasound-guided axillary brachial plexus block.

Authors:  O M A Ahmed; T Niessen; B D O'Donnell; A G Gallagher; D S Breslin; A DunnGalvin; G D Shorten
Journal:  Anaesthesia       Date:  2017-07-25       Impact factor: 6.955

Review 7.  [Anesthesia for medical students : A brief guide to practical anesthesia in adults with a web-based video illustration].

Authors:  S Mathis; O Schlafer; J Abram; J Kreutziger; P Paal; V Wenzel
Journal:  Anaesthesist       Date:  2016-12       Impact factor: 1.041

8.  Students' Learning Experiences from Didactic Teaching Sessions Including Patient Case Examples as Either Text or Video: A Qualitative Study.

Authors:  Kamilla Pedersen; Martin Holdgaard Moeller; Charlotte Paltved; Ole Mors; Charlotte Ringsted; Anne Mette Morcke
Journal:  Acad Psychiatry       Date:  2017-10-06

Review 9.  Information Seeking in Social Media: A Review of YouTube for Sedentary Behavior Content.

Authors:  Emily Knight; Brittany Intzandt; Alicia MacDougall; Travis J Saunders
Journal:  Interact J Med Res       Date:  2015-01-20

10.  Evaluation of the educational value of YouTube videos about physical examination of the cardiovascular and respiratory systems.

Authors:  Samy A Azer; Hala A Algrain; Rana A AlKhelaif; Sarah M AlEshaiwi
Journal:  J Med Internet Res       Date:  2013-11-13       Impact factor: 5.428

Cited by (1 in total):

1.  Objective validation of YouTube™ educational videos for the instruction of regional anesthesia nerve blocks: a novel approach.

Authors:  George L Tewfik; Adam N Work; Steven M Shulman; Patrick Discepola
Journal:  BMC Anesthesiol       Date:  2020-07-09       Impact factor: 2.217

