Silis Y Jiang, Chunhua Weng.
Abstract
Clinical trials are fundamental to the advancement of medicine but constantly face recruitment difficulties. Various clinical trial search engines have been designed to help health consumers identify trials for which they may be eligible. Unfortunately, knowledge of the usefulness and usability of their designs remains scarce. In this study, we used mixed methods, including time-motion analysis, think-aloud protocol, and a survey, to evaluate five popular clinical trial search engines with 11 users. Differences in user preferences and time spent on each system were observed and correlated with user characteristics. In general, searching for applicable trials using these systems is a cognitively demanding task. Our results show that user perceptions of these systems are multifactorial. The survey indicated eTACTS to be the generally preferred system, but this finding did not persist across all mixed methods. This study confirms the value of mixed methods for a comprehensive system evaluation. Future system designers must be aware that different user groups expect different functionalities.
Entities:
Year: 2014 PMID: 25954590 PMCID: PMC4419768
Source DB: PubMed Journal: AMIA Jt Summits Transl Sci Proc
Summary of the Representative Clinical Trial Search Engines (N=5)
| System | Key Functionalities |
|---|---|
| ClinicalTrials.gov | Offers both simple and advanced searches using string-based free-text search |
| Corengi | Matches patients to clinical trials based on user-provided profile information |
| Dory/TrialX | Provides summaries and contact information to users based on question-and-answer sessions |
| eTACTS | Provides interactive tag cloud to allow users to select clinical terms to filter clinical trials |
| PatientsLikeMe | Provides a set of pre-formed search queries to help filter clinical trials |
Participant Diversity (N=11)
| Diversity Dimension | Count | Percentage of Sample (N=11) |
|---|---|---|
| Male | 10 | 90.9% |
| Clinicians (MD) | 3 | 27.3% |
| Database Administrators | 3 | 27.3% |
| Graduate Students | 3 | 27.3% |
| Clinical Research Coordinators | 2 | 18.2% |
| Experienced Users | 6 | 54.5% |
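As a quick arithmetic check (a sketch, not part of the original analysis), each percentage in the table above is simply the count over the sample size N = 11:

```python
# Participant counts from the diversity table; percentages are count / N.
counts = {
    "Male": 10,
    "Clinicians (MD)": 3,
    "Database Administrators": 3,
    "Graduate Students": 3,
    "Clinical Research Coordinators": 2,
    "Experienced Users": 6,
}
N = 11  # total participants

for dimension, n in counts.items():
    print(f"{dimension}: {n}/{N} = {100 * n / N:.1f}%")
```

Note that the dimensions overlap (e.g., a clinician may also be an experienced user), so the percentages do not sum to 100%.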
Time-Motion Analysis Task List
| Task | Description | Category |
|---|---|---|
| Typing Information | User enters initial search query information (e.g., diabetes mellitus type II) into the clinical trial search engine. | Preparatory |
| Answer Questions | User responds to a set of iterative questions (only used for Dory) | Preparatory |
| Refine Tag Cloud | User reviews and selects tag cloud options (only used for eTACTS) | Preparatory |
| Entering Profile | User enters information required to establish medical profile (only used for Corengi and PatientsLikeMe) | Preparatory |
| System Interactions | Time required by the search engine to process information and return a response, plus time spent by the user navigating the interface. | Interaction |
| Result Review | User reviews the returned list of clinical trials. Participants may determine mock patient eligibility at this stage. | Review |
| Trial Review | User reviews a single clinical trial to determine whether mock patient would qualify for the study. | Review |
Average time spent by user groups per system (A: average time spent; B: average physician time; C: average non-physician time; D = C − B; E: average experienced-user time; F: average novice-user time; G = E − F). All measures are in minutes; ⬇ marks the ranking column.
| System | A (⬇) | B | C | D=C-B | p-value (D) | E | F | G=E-F | p-value (G) |
|---|---|---|---|---|---|---|---|---|---|
| Dory/TrialX.com | 7.52 | 7.27 | 7.62 | 0.35 | 0.776 | 8.02 | 6.93 | 1.09 | 0.429 |
| Corengi | 8.19 | 8.05 | 8.22 | 0.22 | 1.00 | 7.86 | 8.51 | −0.65 | 0.310 |
| ClinicalTrials.gov | 8.44 | 7.74 | 8.71 | 0.97 | 0.63 | 7.30 | 9.82 | −2.52 | 0.247 |
| PatientsLikeMe | 9.78 | 7.99 | 10.45 | 2.46 | 0.776 | 8.22 | 11.65 | −3.43 | 0.126 |
| eTACTS | 9.91 | 5.33 | 11.62 | 6.29 | | 11.06 | 8.52 | 2.54 | 0.792 |
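The difference columns in the table above are straight subtractions of the adjacent group averages. A minimal sketch, using the tabulated values (minutes), recomputes D = C − B and G = E − F per system:

```python
# Group averages from the timing table:
# (B: physician, C: non-physician, E: experienced, F: novice), in minutes.
rows = {
    "Dory/TrialX.com":    (7.27, 7.62, 8.02, 6.93),
    "Corengi":            (8.05, 8.22, 7.86, 8.51),
    "ClinicalTrials.gov": (7.74, 8.71, 7.30, 9.82),
    "PatientsLikeMe":     (7.99, 10.45, 8.22, 11.65),
    "eTACTS":             (5.33, 11.62, 11.06, 8.52),
}

for system, (b, c, e, f) in rows.items():
    d = round(c - b, 2)  # positive: non-physicians slower than physicians
    g = round(e - f, 2)  # positive: experienced users slower than novices
    print(f"{system}: D = {d:+.2f}, G = {g:+.2f}")
```

A couple of printed differences deviate from the tabulated ones by a few hundredths, presumably from rounding in the source data.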
Average time spent per user per task group by system (all measures are in minutes; ⬇ indicates the ranking column).
| System | Interaction | Preparatory | Review | Other | Total (⬇) |
|---|---|---|---|---|---|
| Dory/TrialX.com | 0.39 | 2.83 | 3.97 | 0.33 | 7.52 |
| Corengi | 1.23 | 2.84 | 3.81 | 0.31 | 8.19 |
| ClinicalTrials.gov | 0.42 | 0.39 | 6.79 | 0.84 | 8.44 |
| PatientsLikeMe | 2.42 | 1.22 | 5.46 | 0.68 | 9.78 |
| eTACTS | 0.45 | 2.29 | 6.24 | 0.93 | 9.91 |
"Other" represents tasks not originally intended to be recorded, such as asking for help or requesting clarification.
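The Total column in the table above is the sum of the four task-group columns. A small sketch verifying this, and ranking systems by total time, using the tabulated values (minutes):

```python
# Per-system task-group times from the table:
# (interaction, preparatory, review, other), in minutes.
times = {
    "Dory/TrialX.com":    (0.39, 2.83, 3.97, 0.33),
    "Corengi":            (1.23, 2.84, 3.81, 0.31),
    "ClinicalTrials.gov": (0.42, 0.39, 6.79, 0.84),
    "PatientsLikeMe":     (2.42, 1.22, 5.46, 0.68),
    "eTACTS":             (0.45, 2.29, 6.24, 0.93),
}

# Rank systems fastest to slowest by summed total (matches the ⬇ column).
for system, parts in sorted(times.items(), key=lambda kv: sum(kv[1])):
    print(f"{system}: total = {sum(parts):.2f} min")
```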
Average ratings for the search engines on a 5-point Likert scale (1: most preferred; 5: least preferred), ordered left to right by ease of use (A = eTACTS, B = Dory/TrialX, C = ClinicalTrials.gov, D = PatientsLikeMe, E = Corengi)
| Aspect | A | B | C | D | E | P-value |
|---|---|---|---|---|---|---|
| Ease of entering information | 2.42 | 3.58 | 2.92 | 3.00 | 4.00 | 0.172 |
| Provided most search guidance | 3.00 | 2.75 | 3.92 | 2.50 | 3.45 | 0.166 |
| Ease of site navigation | 1.75 | 2.83 | 2.17 | 3.17 | 3.36 | 0.173 |
| Ease of use with no prior experience | 2.08 | 2.75 | 2.92 | 3.33 | 3.18 | 0.351 |
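To illustrate why the survey pointed to eTACTS as the generally preferred system, a simple (assumed, unweighted) averaging of the four Likert aspects per system, where a lower mean rating means more preferred:

```python
# Likert ratings per system across the four survey aspects
# (ease of entry, search guidance, navigation, no-prior-experience use).
ratings = {
    "eTACTS":             [2.42, 3.00, 1.75, 2.08],
    "Dory/TrialX":        [3.58, 2.75, 2.83, 2.75],
    "ClinicalTrials.gov": [2.92, 3.92, 2.17, 2.92],
    "PatientsLikeMe":     [3.00, 2.50, 3.17, 3.33],
    "Corengi":            [4.00, 3.45, 3.36, 3.18],
}

means = {system: sum(v) / len(v) for system, v in ratings.items()}

# Most preferred (lowest mean) first.
for system, m in sorted(means.items(), key=lambda kv: kv[1]):
    print(f"{system}: mean rating {m:.2f}")
```

Note this unweighted mean is only an illustration; the original study compares systems per aspect, and none of the per-aspect differences reached significance (all p > 0.16).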